Communication-efficient distributed learning: An overview

X Cao, T Başar, S Diggavi, YC Eldar… - IEEE journal on …, 2023 - ieeexplore.ieee.org
Distributed learning is envisioned as the bedrock of next-generation intelligent networks,
in which agents such as mobile devices, robots, and sensors exchange information …

Communication efficiency in federated learning: Achievements and challenges

O Shahid, S Pouriyeh, RM Parizi, QZ Sheng… - arXiv preprint arXiv …, 2021 - arxiv.org
Federated Learning (FL) performs machine learning tasks in a distributed
manner. Over the years, it has emerged as a prominent technology, especially with various …

Advances and open problems in federated learning

P Kairouz, HB McMahan, B Avent… - … and trends® in …, 2021 - nowpublishers.com
Federated learning (FL) is a machine learning setting where many clients (e.g., mobile
devices or whole organizations) collaboratively train a model under the orchestration of a …

Clustered federated learning: Model-agnostic distributed multitask optimization under privacy constraints

F Sattler, KR Müller, W Samek - IEEE transactions on neural …, 2020 - ieeexplore.ieee.org
Federated learning (FL) is currently the most widely adopted framework for collaborative
training of (deep) machine learning models under privacy constraints. Despite its popularity, it …

A unified theory of decentralized SGD with changing topology and local updates

A Koloskova, N Loizou, S Boreiri… - International …, 2020 - proceedings.mlr.press
Decentralized stochastic optimization methods have recently gained a lot of attention, mainly
because of their cheap per-iteration cost, data locality, and communication efficiency. In …
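The core round of such methods combines a local stochastic gradient step with gossip averaging over a mixing (topology) matrix. A minimal sketch, with illustrative names and a fully connected three-node network standing in for an arbitrary changing topology:

```python
import numpy as np

def decentralized_sgd_step(params, grads, mixing_matrix, lr=0.1):
    """One round of decentralized SGD: local gradient step, then gossip averaging.

    params: (n_nodes, dim) array of per-node model parameters
    grads:  (n_nodes, dim) array of per-node stochastic gradients
    mixing_matrix: (n_nodes, n_nodes) doubly stochastic matrix W, where
        W[i, j] > 0 only if nodes i and j are neighbors in the topology
    """
    local = params - lr * grads   # each node takes its own SGD step
    return mixing_matrix @ local  # then averages with its neighbors

# Fully connected 3-node network: every node averages with all others.
W = np.full((3, 3), 1.0 / 3.0)
params = np.array([[0.0], [3.0], [6.0]])
grads = np.zeros_like(params)    # zero gradients: pure gossip averaging
out = decentralized_sgd_step(params, grads, W)
```

With zero gradients the step reduces to consensus averaging, so every node moves to the network mean; a sparser W would need several rounds to mix.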

Federated learning with compression: Unified analysis and sharp guarantees

F Haddadpour, MM Kamani… - International …, 2021 - proceedings.mlr.press
In federated learning, communication cost is often a critical bottleneck when scaling up distributed
optimization algorithms to collaboratively learn a model from millions of devices with …
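A standard compression operator in this line of work is top-k sparsification: each device transmits only the k largest-magnitude gradient coordinates. A minimal sketch (the function name is illustrative, not from the paper):

```python
import numpy as np

def top_k_sparsify(grad, k):
    """Keep only the k largest-magnitude entries of a gradient, zeroing the rest.

    Transmitting k values plus k indices instead of the full vector is what
    cuts the per-round communication cost.
    """
    flat = grad.ravel()
    idx = np.argpartition(np.abs(flat), -k)[-k:]  # indices of top-k magnitudes
    out = np.zeros_like(flat)
    out[idx] = flat[idx]
    return out.reshape(grad.shape)

g = np.array([0.1, -2.0, 0.3, 4.0, -0.05])
compressed = top_k_sparsify(g, 2)  # keeps only the -2.0 and 4.0 entries
```

Top-k is a biased compressor, which is why analyses like the one above (and the error-feedback literature) treat it separately from unbiased quantizers.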

EF21: A new, simpler, theoretically better, and practically faster error feedback

P Richtárik, I Sokolov… - Advances in Neural …, 2021 - proceedings.neurips.cc
Error feedback (EF), also known as error compensation, is an immensely popular
convergence stabilization mechanism in the context of distributed training of supervised …
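The classic error-compensation loop that EF21 refines works by compressing the gradient plus an accumulated residual, sending the compressed message, and carrying the leftover error into the next round. A minimal sketch of that standard mechanism, with illustrative names (EF21 itself differs by compressing gradient differences):

```python
import numpy as np

def top_k(v, k):
    """Illustrative biased compressor: keep the k largest-magnitude entries."""
    out = np.zeros_like(v)
    idx = np.argpartition(np.abs(v), -k)[-k:]
    out[idx] = v[idx]
    return out

def error_feedback_step(grad, error, k):
    """Classic error feedback: compress grad + accumulated error, transmit
    the compressed message, and keep the residual locally for next round."""
    corrected = grad + error
    message = top_k(corrected, k)    # what the worker actually sends
    new_error = corrected - message  # residual carried forward
    return message, new_error

g = np.array([1.0, 0.2, -3.0])
msg, err = error_feedback_step(g, np.zeros(3), k=1)  # sends only the -3.0 entry
```

Because the residual is re-injected every round, no gradient information is permanently discarded, which is what stabilizes convergence under biased compressors like top-k.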

GossipFL: A decentralized federated learning framework with sparsified and adaptive communication

Z Tang, S Shi, B Li, X Chu - IEEE Transactions on Parallel and …, 2022 - ieeexplore.ieee.org
Recently, federated learning (FL) techniques have enabled multiple users to train machine
learning models collaboratively without data sharing. However, existing FL algorithms suffer …

MARINA: Faster non-convex distributed learning with compression

E Gorbunov, KP Burlachenko, Z Li… - … on Machine Learning, 2021 - proceedings.mlr.press
We develop and analyze MARINA: a new communication-efficient method for non-convex
distributed learning over heterogeneous datasets. MARINA employs a novel communication …
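A distinctive ingredient of MARINA is that workers usually transmit a compressed gradient *difference* and only occasionally (with some probability p) synchronize with an uncompressed gradient. A hedged sketch of that message rule, with illustrative names and a caller-supplied compressor:

```python
import numpy as np

rng = np.random.default_rng(0)

def marina_message(grad, prev_grad, p, compress):
    """MARINA-style message rule (illustrative sketch):
    with probability p, send the full gradient (periodic exact sync);
    otherwise send a compressed gradient difference, which is cheap to
    transmit when successive gradients are similar."""
    if rng.random() < p:
        return grad
    return compress(grad - prev_grad)

g_prev = np.array([1.0, 1.0])
g_curr = np.array([1.5, 0.5])
full = marina_message(g_curr, g_prev, p=1.0, compress=lambda d: d)
diff = marina_message(g_curr, g_prev, p=0.0, compress=lambda d: d)
```

Compressing differences rather than raw gradients is what lets the method tolerate heterogeneous data: the differences shrink as training stabilizes even when per-worker gradients stay far apart.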

Exponential graph is provably efficient for decentralized deep training

B Ying, K Yuan, Y Chen, H Hu… - Advances in Neural …, 2021 - proceedings.neurips.cc
Decentralized SGD is an emerging training method for deep learning, known for its much
lower (and thus faster) per-iteration communication; it relaxes the averaging step in parallel …
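In the one-peer exponential graph construction studied in this line of work, each of the n nodes (n a power of two) talks to a single neighbor per round, cycling through offsets 1, 2, 4, …, n/2. A minimal sketch of that neighbor schedule, under the assumption that n is a power of two (function name illustrative):

```python
import math

def exponential_graph_peer(i, t, n):
    """Neighbor of node i at round t in a one-peer exponential graph
    over n nodes (n must be a power of two): the offset cycles through
    the powers of two 1, 2, 4, ..., n // 2."""
    tau = t % int(math.log2(n))  # which power of two to use this round
    return (i + 2 ** tau) % n

# Node 0 in an 8-node network cycles through peers 1, 2, 4, then repeats.
peers = [exponential_graph_peer(0, t, 8) for t in range(4)]
```

Each round costs only one message per node, yet over log2(n) rounds information from every node reaches every other, which is the source of the communication efficiency.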