Communication efficient distributed learning with censored, quantized, and generalized group ADMM

CB Issaid, A Elgabli, J Park, M Bennis… - arXiv preprint arXiv …, 2020 - arxiv.org
In this paper, we propose a communication-efficient decentralized machine learning
framework that solves a consensus optimization problem defined over a network of inter …
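
The censoring idea named in the title can be sketched in a few lines: a worker re-transmits its quantized model only when it has changed enough since its last transmission. The sketch below is illustrative only, assuming stochastic uniform quantization and a geometrically decaying threshold c * rho**k; the function names and constants are ours, not the paper's.

import numpy as np

def quantize(x, num_bits=4):
    # Stochastic uniform quantization over [x.min(), x.max()]; rounding is unbiased.
    lo, hi = x.min(), x.max()
    if hi == lo:
        return x.copy()
    levels = 2 ** num_bits - 1
    scaled = (x - lo) / (hi - lo) * levels
    floor = np.floor(scaled)
    q = floor + (np.random.rand(*x.shape) < (scaled - floor))
    return lo + q / levels * (hi - lo)

def maybe_transmit(theta, last_sent, k, c=1.0, rho=0.8, num_bits=4):
    # Censoring: skip the upload when the quantized model barely moved
    # relative to a threshold that decays over iterations k (assumed schedule).
    q = quantize(theta, num_bits)
    if np.linalg.norm(q - last_sent) >= c * rho ** k:
        return q, True        # transmit; neighbors store the new copy
    return last_sent, False   # censored; neighbors reuse the stale copy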

Communication efficient decentralized learning over bipartite graphs

CB Issaid, A Elgabli, J Park, M Bennis… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org
In this paper, we propose a communication-efficient decentralized machine learning
framework that solves a consensus optimization problem defined over a network of inter …

Communication efficient framework for decentralized machine learning

A Elgabli, J Park, AS Bedi, M Bennis… - 2020 54th Annual …, 2020 - ieeexplore.ieee.org
In this paper, we propose a fast, privacy-aware, and communication-efficient decentralized
framework to solve the distributed machine learning (DML) problem. The proposed …

Q-GADMM: Quantized group ADMM for communication efficient decentralized machine learning

A Elgabli, J Park, AS Bedi, CB Issaid… - IEEE Transactions …, 2020 - ieeexplore.ieee.org
In this article, we propose a communication-efficient decentralized machine learning (ML)
algorithm, coined quantized group ADMM (Q-GADMM). To reduce the number of …
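
The snippet is cut off before the mechanism, but the quantization step of Q-GADMM can be illustrated as follows: each worker transmits a stochastically quantized version of the change in its model since the last transmitted copy, which keeps the rounding unbiased. The bit width and quantization range below are illustrative choices, not the paper's.

import numpy as np

def stochastic_quantize_diff(theta, theta_ref, num_bits=2):
    # Quantize the change theta - theta_ref on a uniform grid over [-r, r]
    # with unbiased stochastic rounding; return the model the receiver
    # reconstructs from the quantized difference.
    diff = theta - theta_ref
    r = np.abs(diff).max()
    if r == 0.0:
        return theta_ref.copy()
    levels = 2 ** num_bits - 1
    scaled = (diff + r) / (2 * r) * levels     # map [-r, r] -> [0, levels]
    floor = np.floor(scaled)
    q = floor + (np.random.rand(*diff.shape) < (scaled - floor))
    return theta_ref + q / levels * 2 * r - r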

GADMM: Fast and communication efficient framework for distributed machine learning

A Elgabli, J Park, AS Bedi, M Bennis… - Journal of Machine …, 2020 - jmlr.org
When the data is distributed across multiple servers, lowering the communication cost
between the servers (or workers) while solving the distributed learning problem is an …
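
GADMM's structure is easy to show on a toy problem: workers on a chain alternate between head and tail groups, each solving a small local problem against its neighbors' latest models, followed by a dual update on the consensus residuals. Below is a minimal sketch with quadratic local costs f_n(theta) = 0.5 * ||theta - a_n||^2, so the consensus optimum is the mean of the a_n; the quadratic stand-in, rho, and the iteration count are our assumptions.

import numpy as np

def gadmm(a, rho=1.0, iters=300):
    N, d = a.shape
    theta = np.zeros((N, d))
    lam = np.zeros((N - 1, d))         # lam[n] couples workers n and n+1

    def local_update(n):
        # Closed-form minimizer of the local augmented Lagrangian term
        # for the quadratic toy cost, given the neighbors' current models.
        num, s = a[n].copy(), 0
        if n > 0:                      # left-neighbor coupling
            num += lam[n - 1] + rho * theta[n - 1]
            s += 1
        if n < N - 1:                  # right-neighbor coupling
            num += -lam[n] + rho * theta[n + 1]
            s += 1
        return num / (1 + s * rho)

    for _ in range(iters):
        for n in range(0, N, 2):       # head workers update in parallel
            theta[n] = local_update(n)
        for n in range(1, N, 2):       # tail workers use the fresh head models
            theta[n] = local_update(n)
        for n in range(N - 1):         # dual ascent on consensus residuals
            lam[n] += rho * (theta[n] - theta[n + 1])
    return theta

theta = gadmm(np.random.randn(6, 3))
print(np.allclose(theta, theta.mean(axis=0), atol=1e-2))  # near consensus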

Low sample and communication complexities in decentralized learning: A triple hybrid approach

X Zhang, J Liu, Z Zhu, ES Bentley - IEEE INFOCOM 2021-IEEE …, 2021 - ieeexplore.ieee.org
Network-consensus-based decentralized learning optimization algorithms have attracted a
significant amount of attention in recent years due to their rapidly growing applications …

Sparse-SignSGD with majority vote for communication-efficient distributed learning

C Park, N Lee - arXiv preprint arXiv:2302.07475, 2023 - arxiv.org
The training efficiency of complex deep learning models can be significantly improved
through the use of distributed optimization. However, this process is often hindered by a …
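
The one-bit aggregation behind Sparse-SignSGD with majority vote is compact enough to sketch: each worker keeps only its largest-magnitude gradient coordinates, sends their signs, and the server takes a coordinate-wise majority vote. The top-k selection rule and the toy gradients below are illustrative assumptions.

import numpy as np

def sparse_sign(grad, k):
    # Keep the k largest-magnitude entries and transmit only their signs.
    out = np.zeros_like(grad)
    idx = np.argsort(np.abs(grad))[-k:]
    out[idx] = np.sign(grad[idx])
    return out

def majority_vote(sign_msgs):
    # Server aggregates one-bit messages by coordinate-wise majority.
    return np.sign(np.sum(sign_msgs, axis=0))

# Toy round: M workers, d-dimensional gradients, top-k sparsification.
M, d, k, lr = 5, 20, 5, 0.1
theta = np.random.randn(d)
grads = [theta + 0.1 * np.random.randn(d) for _ in range(M)]
vote = majority_vote([sparse_sign(g, k) for g in grads])
theta -= lr * vote   # all workers apply the voted sign direction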

Fast decentralized learning via hybrid consensus ADMM

M Ma, AN Nikolakopoulos… - 2018 IEEE International …, 2018 - ieeexplore.ieee.org
The Alternating Direction Method of Multipliers (ADMM) has witnessed a resurgence of
interest over the past few years, fueled by the ever-increasing demand for scalable …

L-FGADMM: Layer-wise federated group ADMM for communication efficient decentralized deep learning

A Elgabli, J Park, S Ahmed… - 2020 IEEE Wireless …, 2020 - ieeexplore.ieee.org
This article proposes a communication-efficient decentralized deep learning algorithm,
coined layer-wise federated group ADMM (L-FGADMM). To minimize an empirical risk …
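
The layer-wise idea in the title can be illustrated with a communication schedule in which different layers of the model are exchanged at different periods, so that large layers are sent less often. The layer sizes and period values below are made-up placeholders, not the paper's settings.

import numpy as np

# Illustrative layer-wise schedule: exchange each layer every period[name]
# iterations; the big fc layer is sent 4x less often than conv1.
layers = {"conv1": np.random.randn(32), "fc": np.random.randn(512)}
period = {"conv1": 1, "fc": 4}

def layers_to_send(k):
    # At iteration k, pick the subset of layers due for exchange.
    return [name for name in layers if k % period[name] == 0]

for k in range(8):
    print(k, layers_to_send(k))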

DQ-SGD: Dynamic quantization in SGD for communication-efficient distributed learning

G Yan, SL Huang, T Lan, L Song - 2021 IEEE 18th …, 2021 - ieeexplore.ieee.org
Gradient quantization is an emerging technique for reducing communication costs in
distributed learning. Existing gradient quantization algorithms often rely on engineering …
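
A minimal sketch of what "dynamic quantization" can look like: instead of a fixed bit width, the number of quantization levels is adjusted at each step, here by a heuristic that spends more bits when the gradient norm is large. This schedule is our assumption for illustration; the paper derives its own rule.

import numpy as np

def uniform_quantize(g, num_bits):
    # Deterministic uniform quantizer over [-r, r].
    r = np.abs(g).max()
    if r == 0.0:
        return g.copy()
    levels = 2 ** num_bits - 1
    q = np.round((g + r) / (2 * r) * levels)
    return q / levels * 2 * r - r

def dynamic_bits(grad_norm, base_norm, min_bits=2, max_bits=8):
    # Heuristic: more bits when gradients are large, so quantization error
    # stays small relative to the optimization progress (assumed rule).
    bits = min_bits + int(np.ceil(np.log2(1 + grad_norm / base_norm)))
    return int(np.clip(bits, min_bits, max_bits))

g = np.random.randn(1000)
b = dynamic_bits(np.linalg.norm(g), base_norm=1.0)
print(b, np.linalg.norm(g - uniform_quantize(g, b)))  # bits used, residual error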