Communication-efficient distributed learning: An overview

X Cao, T Başar, S Diggavi, YC Eldar… - IEEE journal on …, 2023 - ieeexplore.ieee.org
Distributed learning is envisioned as the bedrock of next-generation intelligent networks,
where intelligent agents, such as mobile devices, robots, and sensors, exchange information …

Ashl: An adaptive multi-stage distributed deep learning training scheme for heterogeneous environments

Z Shen, Q Tang, T Zhou, Y Zhang, Z Jia… - IEEE Transactions …, 2023 - ieeexplore.ieee.org
With the growth of dataset and model sizes, distributed deep learning has been
proposed to accelerate training and improve the accuracy of DNN models. The parameter …

L-FGADMM: Layer-wise federated group ADMM for communication efficient decentralized deep learning

A Elgabli, J Park, S Ahmed… - 2020 IEEE Wireless …, 2020 - ieeexplore.ieee.org
This article proposes a communication-efficient decentralized deep learning algorithm,
coined layer-wise federated group ADMM (L-FGADMM). To minimize an empirical risk …

A²CiD²: Accelerating Asynchronous Communication in Decentralized Deep Learning

A Nabli, E Belilovsky, E Oyallon - Advances in Neural …, 2024 - proceedings.neurips.cc
Distributed training of Deep Learning models has been critical to many recent
successes in the field. Current standard methods primarily rely on synchronous centralized …

Distributed Analytics For Big Data: A Survey

F Berloco, V Bevilacqua, S Colucci - Neurocomputing, 2024 - Elsevier
In recent years, constant and rapid information growth has characterized digital
applications in the majority of real-life scenarios. Thus, a new information asset, namely Big …

Distributed machine learning: Foundations, trends, and practices

TY Liu, W Chen, T Wang - … of the 26th International Conference on World …, 2017 - dl.acm.org
In recent years, artificial intelligence has achieved great success in many important
applications. Both novel machine learning algorithms (e.g., deep neural networks) and their …

Model accuracy and runtime tradeoff in distributed deep learning: A systematic study

S Gupta, W Zhang, F Wang - 2016 IEEE 16th International …, 2016 - ieeexplore.ieee.org
Deep learning with a large number of parameters requires distributed training, where model
accuracy and runtime are two important factors to be considered. However, there has been …

DIMAT: Decentralized Iterative Merging-And-Training for Deep Learning Models

N Saadati, M Pham, N Saleem… - Proceedings of the …, 2024 - openaccess.thecvf.com
Recent advances in decentralized deep learning algorithms have demonstrated cutting-
edge performance on various tasks with large pre-trained models. However, a pivotal …

Distributed reduced convolution neural networks

M Alajanbi, D Malerba, H Liu - … Journal of Big …, 2021 - journals.mesopotamian.press
A Convolutional Neural Network (CNN) is a popular tool in the domains of pattern
recognition and machine learning. The performance of KCNN (kernel-based convolutional …