Commit a257e5b6 authored by aurelien.bellet

introduce label distribution skew in sec 2

parent 8c9e245a
@@ -11,8 +11,17 @@ labeled data point by a tuple $(x,y)$ where $x$ represents the data point
 Each
 node has
 access to a local dataset that
-follows its own local distribution $D_i$. The goal is to find the parameters
-$\theta$ of a global model that performs well on the union of the local
+follows its own local distribution $D_i$ which may differ from that of other
+nodes.
+In this work, we focus on label distribution skew: denoting by $p_i(x,y)=p_i
+(x|y)p_i(y)$ the
+probability of $(x,y)$ under the local distribution $D_i$ of node $i$, we
+assume that $p_i(y)$ varies across nodes. We refer to
+\cite{kairouz2019advances,quagmire} for concrete examples of problems
+with label distribution skew.
+The objective is to find the parameters
+$\theta$ of a global model that performs well on the union of the local
 distributions by
 minimizing
 the average training loss:
@@ -26,8 +35,10 @@ function
 on node $i$. Therefore, $\mathds{E}_{(x_i,y_i) \sim D_i} F_i(\theta;x_i,y_i)$
 denotes
 the
-expected loss of model $\theta$ over the local data distribution
-$D_i$.
+expected loss of model $\theta$ over $D_i$.
 To collaboratively solve Problem \eqref{eq:dist-optimization-problem}, each
 node can exchange messages with its neighbors in an undirected network graph
...
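Note on the objective: the committed equation behind \eqref{eq:dist-optimization-problem} sits in the collapsed part of this diff. Judging from the surrounding text (the average training loss across nodes, with $\mathds{E}_{(x_i,y_i) \sim D_i} F_i(\theta;x_i,y_i)$ the expected loss on node $i$), it plausibly takes the standard form below. This is a reconstruction from context, not the committed source; in particular the symbol $n$ (number of nodes) is an assumption.

```latex
% Plausible form of eq:dist-optimization-problem, reconstructed from the
% surrounding text; the committed equation is collapsed in this diff view.
% The symbol $n$ (number of nodes) is assumed, not taken from the diff.
\begin{equation}\label{eq:dist-optimization-problem}
  \min_{\theta} \; \frac{1}{n} \sum_{i=1}^{n}
    \mathds{E}_{(x_i,y_i) \sim D_i} F_i(\theta; x_i, y_i)
\end{equation}
```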
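On the label distribution skew assumption itself ($p_i(y)$ varying across nodes under the factorization $p_i(x,y)=p_i(x|y)p_i(y)$), here is a minimal Python sketch of how such skew is commonly simulated in federated learning experiments: each class is split across nodes with Dirichlet-distributed proportions. The helper `split_by_label_skew` and its parameters are illustrative assumptions, not code from this commit or from \cite{kairouz2019advances,quagmire}.

```python
import numpy as np

def split_by_label_skew(labels, n_nodes, alpha=0.5, seed=0):
    """Partition sample indices across nodes so that p_i(y) varies by node.

    Each class is divided among the nodes according to Dirichlet(alpha)
    proportions; a smaller alpha yields a stronger label distribution skew.
    (Illustrative sketch, not code from the commit.)
    """
    rng = np.random.default_rng(seed)
    node_indices = [[] for _ in range(n_nodes)]
    for c in np.unique(labels):
        idx = rng.permutation(np.flatnonzero(labels == c))
        # Fraction of class-c samples that each node receives.
        props = rng.dirichlet(alpha * np.ones(n_nodes))
        cuts = (np.cumsum(props)[:-1] * len(idx)).astype(int)
        for node, chunk in enumerate(np.split(idx, cuts)):
            node_indices[node].extend(chunk.tolist())
    return node_indices

# Toy check: 100 samples over 4 classes, split across 3 nodes.
labels = np.repeat(np.arange(4), 25)
for i, part in enumerate(split_by_label_skew(labels, n_nodes=3, alpha=0.1)):
    print(f"node {i}: label counts = {np.bincount(labels[part], minlength=4)}")
```

With a small `alpha` (e.g. 0.1), each node typically ends up dominated by one or two labels, which is exactly the kind of variation in $p_i(y)$ the new text describes.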