Communication-efficient decentralized machine learning over heterogeneous networks

P Zhou, Q Lin, D Loghin, BC Ooi… - 2021 IEEE 37th …, 2021 - ieeexplore.ieee.org
In the last few years, distributed machine learning has usually been executed over
heterogeneous networks such as a local area network within a multi-tenant cluster or a wide …

Exponential graph is provably efficient for decentralized deep training

B Ying, K Yuan, Y Chen, H Hu… - Advances in Neural …, 2021 - proceedings.neurips.cc
Decentralized SGD is an emerging training method for deep learning, known for its much
lighter (and thus faster) communication per iteration, which relaxes the averaging step in parallel …
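
The "relaxed averaging step" mentioned above is the core of decentralized SGD: instead of a global all-reduce, each node averages parameters only with its graph neighbors. Below is a minimal sketch of one synchronous D-SGD round, assuming a list-of-arrays setup; the names (`dsgd_step`, `mixing_weights`) are hypothetical, and this illustrates the general pattern rather than this paper's exact algorithm.

```python
def dsgd_step(params, grads, neighbors, mixing_weights, lr=0.1):
    """One synchronous decentralized-SGD round over all nodes.

    params:         list of parameter vectors (e.g. np.ndarray), one per node
    grads:          list of stochastic gradients, one per node
    neighbors:      neighbors[i] = iterable of node ids adjacent to i (incl. i)
    mixing_weights: mixing_weights[i][j] = weight node i gives node j's params
    """
    # Local SGD update on every node.
    local = [p - lr * g for p, g in zip(params, grads)]
    # Gossip averaging: each node mixes parameters with its neighbors only,
    # replacing the global all-reduce of centralized parallel SGD.
    return [
        sum(mixing_weights[i][j] * local[j] for j in neighbors[i])
        for i in range(len(local))
    ]
```

For a ring of n nodes, for instance, neighbors[i] would be {i-1, i, i+1} (mod n) with uniform weight 1/3 each.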

Communication-efficient decentralized learning with sparsification and adaptive peer selection

Z Tang, S Shi, X Chu - 2020 IEEE 40th International …, 2020 - ieeexplore.ieee.org
Increasing the size of machine learning models, especially deep neural network models,
can improve model generalization capability. However, large models require more …
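
Sparsification in this setting typically means transmitting only the k largest-magnitude gradient entries per round, as (index, value) pairs, shrinking each message from O(d) to O(k). A minimal top-k sketch (NumPy; the helper names are hypothetical, and this omits the paper's adaptive peer selection):

```python
import numpy as np

def topk_sparsify(grad, k):
    """Keep only the k largest-magnitude entries; send (indices, values)."""
    idx = np.argpartition(np.abs(grad), -k)[-k:]
    return idx, grad[idx]

def densify(idx, vals, dim):
    """Rebuild a dense gradient from a received sparse message."""
    out = np.zeros(dim)
    out[idx] = vals
    return out
```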

Beyond exponential graph: Communication-efficient topologies for decentralized learning via finite-time convergence

Y Takezawa, R Sato, H Bao, K Niwa… - Advances in Neural …, 2024 - proceedings.neurips.cc
Decentralized learning has recently been attracting increasing attention for its applications
in parallel computation and privacy preservation. Many recent studies have stated that the …
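
Both this entry and the exponential-graph paper above concern communication topologies. As a concrete reference point, here is a sketch of the exponential graph's neighbor structure on n nodes, in its static form and in the one-peer variant that cycles through one link per iteration; the helper names are hypothetical.

```python
import math

def static_exp_neighbors(i, n):
    """Out-neighbors of node i in the static exponential graph on n nodes:
    peers at hop distances 1, 2, 4, ... (powers of two)."""
    hops = max(1, math.ceil(math.log2(n)))
    return [(i + 2 ** k) % n for k in range(hops)]

def one_peer_exp_neighbor(i, n, t):
    """The single peer node i contacts at iteration t in the one-peer
    variant, which cycles round-robin through the power-of-two hops."""
    hops = max(1, math.ceil(math.log2(n)))
    return (i + 2 ** (t % hops)) % n
```

For n = 8, node 0's static neighbors are {1, 2, 4}, so information can traverse the graph in O(log n) hops while each node talks to only O(log n) peers (or one peer per round in the one-peer variant).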

L-FGADMM: Layer-wise federated group ADMM for communication efficient decentralized deep learning

A Elgabli, J Park, S Ahmed… - 2020 IEEE Wireless …, 2020 - ieeexplore.ieee.org
This article proposes a communication-efficient decentralized deep learning algorithm,
coined layer-wise federated group ADMM (L-FGADMM). To minimize an empirical risk …
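
The paper's updates are ADMM-based; purely as a generic illustration of the "layer-wise" idea, exchanging different layers at different frequencies can be sketched with a simple scheduler (a hypothetical helper, not L-FGADMM's actual update rule):

```python
def layers_to_send(t, periods):
    """periods[l] = communication interval of layer l (in iterations);
    returns the layer indices due for exchange at iteration t."""
    return [l for l, p in enumerate(periods) if t % p == 0]

# e.g. with periods = [1, 2, 4], iteration t = 2 exchanges layers [0, 1],
# so the (often larger) later layers are communicated less frequently.
```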

Error-compensated sparsification for communication-efficient decentralized training in edge environment

H Wang, S Guo, Z Qu, R Li, Z Liu - IEEE Transactions on …, 2021 - ieeexplore.ieee.org
Communication has been considered a major bottleneck in large-scale decentralized
training systems, since participating nodes iteratively exchange large amounts of …
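
Error compensation (often called error feedback) keeps a local residual of whatever the compressor drops and re-injects it before the next round, so no gradient mass is permanently lost. A minimal sketch, assuming top-k as the compressor; the class and attribute names are hypothetical:

```python
import numpy as np

class ErrorFeedbackCompressor:
    """Top-k compressor with error feedback: dropped entries are accumulated
    locally and added back before the next compression."""

    def __init__(self, dim, k):
        self.residual = np.zeros(dim)  # error carried over between rounds
        self.k = k

    def compress(self, grad):
        corrected = grad + self.residual           # re-inject past error
        idx = np.argpartition(np.abs(corrected), -self.k)[-self.k:]
        sparse = np.zeros_like(corrected)
        sparse[idx] = corrected[idx]               # what actually gets sent
        self.residual = corrected - sparse         # remember what was dropped
        return sparse
```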

Cross-gradient aggregation for decentralized learning from non-iid data

Y Esfandiari, SY Tan, Z Jiang, A Balu… - International …, 2021 - proceedings.mlr.press
Decentralized learning enables a group of collaborative agents to learn models using a
distributed dataset without the need for a central parameter server. Recently, decentralized …
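
In cross-gradient aggregation, an agent's model is evaluated against its neighbors' local data to obtain "cross-gradients", which counteracts drift under non-IID data. The sketch below simplifies the aggregation to a uniform average (the paper instead projects via quadratic programming); `grad_fn` and `datasets` are hypothetical stand-ins.

```python
def cga_like_step(model_i, grad_fn, datasets, neighbors_i, lr=0.1):
    """grad_fn(model, data) -> gradient of the local loss on `data`;
    datasets[j] is node j's local data; neighbors_i includes i itself."""
    # Cross-gradients: the SAME model differentiated on DIFFERENT nodes' data.
    cross_grads = [grad_fn(model_i, datasets[j]) for j in neighbors_i]
    # Simplified aggregation: uniform average (the paper uses a QP projection).
    agg = sum(cross_grads) / len(cross_grads)
    return model_i - lr * agg
```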

Communication-efficient distributed deep learning: A comprehensive survey

Z Tang, S Shi, W Wang, B Li, X Chu - arXiv preprint arXiv:2003.06307, 2020 - arxiv.org
Distributed deep learning (DL) has become prevalent in recent years to reduce training time
by leveraging multiple computing devices (e.g., GPUs/TPUs) due to larger models and …

Data-heterogeneity-aware mixing for decentralized learning

Y Dandi, A Koloskova, M Jaggi, SU Stich - arXiv preprint arXiv:2204.06477, 2022 - arxiv.org
Decentralized learning provides an effective framework to train machine learning models
with data distributed over arbitrary communication graphs. However, most existing …

Cross-feature Contrastive Loss for Decentralized Deep Learning on Heterogeneous Data

SA Aketi, K Roy - Proceedings of the IEEE/CVF Winter …, 2024 - openaccess.thecvf.com
Current state-of-the-art decentralized learning algorithms mostly assume the data
to be independent and identically distributed (IID). However, in practical …