A survey on federated learning for resource-constrained IoT devices

A Imteaj, U Thakker, S Wang, J Li… - IEEE Internet of Things …, 2021 - ieeexplore.ieee.org
Federated learning (FL) is a distributed machine learning strategy that generates a global
model by learning from multiple decentralized edge clients. FL enables on-device training …
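The on-device collaborative training described above can be illustrated with a minimal FedAvg-style sketch. This is a hypothetical toy example (least-squares clients, plain averaging), not the surveyed systems themselves; all names (`local_sgd`, `fedavg_round`) are invented for illustration.

```python
import numpy as np

def local_sgd(w, X, y, lr=0.1, steps=5):
    """A few local gradient steps on a least-squares loss (on-device training)."""
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

def fedavg_round(w_global, clients):
    """One communication round: broadcast the global model,
    train locally on each client, average the results."""
    updates = [local_sgd(w_global.copy(), X, y) for X, y in clients]
    return np.mean(updates, axis=0)

# Synthetic decentralized clients sharing a common ground truth.
rng = np.random.default_rng(0)
w_true = np.array([1.0, -2.0])
clients = []
for _ in range(4):
    X = rng.normal(size=(20, 2))
    clients.append((X, X @ w_true))

w = np.zeros(2)
for _ in range(50):
    w = fedavg_round(w, clients)
# w now approximates w_true without any client sharing raw data.
```

In a real resource-constrained setting the local step count, model size, and participation rate would all be tuned to device budgets, which is precisely the design space the survey covers.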

A critical review on the use (and misuse) of differential privacy in machine learning

A Blanco-Justicia, D Sánchez, J Domingo-Ferrer… - ACM Computing …, 2022 - dl.acm.org
We review the use of differential privacy (DP) for privacy protection in machine learning
(ML). We show that, driven by the aim of preserving the accuracy of the learned models, DP …

Federated multi-task learning under a mixture of distributions

O Marfoq, G Neglia, A Bellet… - Advances in Neural …, 2021 - proceedings.neurips.cc
The increasing size of data generated by smartphones and IoT devices motivated the
development of Federated Learning (FL), a framework for on-device collaborative training of …

ProxSkip: Yes! Local gradient steps provably lead to communication acceleration! Finally!

K Mishchenko, G Malinovsky, S Stich… - International …, 2022 - proceedings.mlr.press
We introduce ProxSkip—a surprisingly simple and provably efficient method for minimizing
the sum of a smooth ($f$) and an expensive nonsmooth proximable ($\psi$) function. The …
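The composite objective $f + \psi$ mentioned in the snippet can be sketched concretely. Below is an illustrative implementation of the ProxSkip update (gradient steps on $f$, the prox of $\psi$ applied only with probability $p$, corrected by a control variate $h$); the specific choices $f(x) = \tfrac{1}{2}\|x - a\|^2$ and $\psi(x) = \lambda\|x\|_1$ are hypothetical, picked only because they admit a closed-form solution to check against.

```python
import numpy as np

def soft_threshold(v, t):
    """Prox operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proxskip(a, lam, gamma=0.5, p=0.2, iters=2000, seed=0):
    rng = np.random.default_rng(seed)
    x = np.zeros_like(a)
    h = np.zeros_like(a)              # control variate, converges to grad f(x*)
    for _ in range(iters):
        grad = x - a                  # gradient of f(x) = 0.5*||x - a||^2
        x_hat = x - gamma * (grad - h)
        if rng.random() < p:          # take the (expensive) prox step
            x_new = soft_threshold(x_hat - (gamma / p) * h, (gamma / p) * lam)
        else:                         # skip the prox this iteration
            x_new = x_hat
        h = h + (p / gamma) * (x_new - x_hat)
        x = x_new
    return x

a = np.array([3.0, 0.2, -1.5])
x_star = soft_threshold(a, 0.5)       # closed-form minimizer of f + psi
x = proxskip(a, lam=0.5)              # matches x_star despite skipping most prox calls
```

The communication-acceleration claim comes from the federated reading of this template, where the prox step corresponds to a communication round and skipping it with probability $p$ reduces how often clients must synchronize.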

Advances and open problems in federated learning

P Kairouz, HB McMahan, B Avent… - … and trends® in …, 2021 - nowpublishers.com
Federated learning (FL) is a machine learning setting where many clients (e.g., mobile
devices or whole organizations) collaboratively train a model under the orchestration of a …

A unified theory of decentralized sgd with changing topology and local updates

A Koloskova, N Loizou, S Boreiri… - International …, 2020 - proceedings.mlr.press
Decentralized stochastic optimization methods have gained a lot of attention recently, mainly
because of their cheap per iteration cost, data locality, and their communication-efficiency. In …

The non-iid data quagmire of decentralized machine learning

K Hsieh, A Phanishayee, O Mutlu… - … on Machine Learning, 2020 - proceedings.mlr.press
Many large-scale machine learning (ML) applications need to perform decentralized
learning over datasets generated at different devices and locations. Such datasets pose a …

Decentralized stochastic optimization and gossip algorithms with compressed communication

A Koloskova, S Stich, M Jaggi - International Conference on …, 2019 - proceedings.mlr.press
We consider decentralized stochastic optimization with the objective function (e.g., data
samples for machine learning tasks) being distributed over n machines that can only …
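The setup in the snippet — an objective split over $n$ machines that communicate only with neighbors — can be sketched as plain gossip SGD on a ring. This is a simplified illustration: the paper's contribution additionally compresses the exchanged messages, which is omitted here, and the quadratic local objectives are hypothetical.

```python
import numpy as np

n, d = 5, 2
rng = np.random.default_rng(1)
targets = rng.normal(size=(n, d))   # node i minimizes 0.5*||x - targets[i]||^2

# Ring-topology mixing matrix: each node averages itself with two neighbors.
# W is doubly stochastic, so the network-wide average is preserved.
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = W[i, (i - 1) % n] = W[i, (i + 1) % n] = 1.0 / 3.0

X = np.zeros((n, d))                # one parameter vector per node
lr = 0.1
for _ in range(500):
    grads = X - targets             # each node's local gradient
    X = W @ (X - lr * grads)        # gradient step, then gossip-average with neighbors

# The nodes' average converges to the minimizer of the sum of local objectives.
consensus = X.mean(axis=0)
```

Replacing the exact exchange `W @ (...)` with a quantized or sparsified message is what the compressed-communication schemes in this line of work study.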

On the linear speedup analysis of communication efficient momentum SGD for distributed non-convex optimization

H Yu, R Jin, S Yang - International Conference on Machine …, 2019 - proceedings.mlr.press
Recent developments on large-scale distributed machine learning applications, e.g., deep
neural networks, benefit enormously from the advances in distributed non-convex …

Decentralized training of foundation models in heterogeneous environments

B Yuan, Y He, J Davis, T Zhang… - Advances in …, 2022 - proceedings.neurips.cc
Training foundation models, such as GPT-3 and PaLM, can be extremely expensive, often
involving tens of thousands of GPUs running continuously for months. These models are …