Communication efficient decentralized learning over bipartite graphs

CB Issaid, A Elgabli, J Park, M Bennis… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org
In this paper, we propose a communication-efficient decentralized machine learning
framework that solves a consensus optimization problem defined over a network of inter …
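The consensus problem referenced in this abstract takes a standard decentralized form; a minimal LaTeX sketch, assuming N workers with local losses f_n and agreement constraints along graph edges (the bipartite structure named in the title is not detailed in the snippet):

```latex
% Decentralized consensus optimization over a graph G = (V, E):
% each worker n keeps a local copy \theta_n of the model, and edge
% constraints force all copies to agree at optimality.
\begin{equation*}
  \min_{\theta_1, \dots, \theta_N} \sum_{n=1}^{N} f_n(\theta_n)
  \quad \text{s.t.} \quad \theta_n = \theta_m
  \quad \forall (n, m) \in \mathcal{E}.
\end{equation*}
```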

Communication efficient distributed learning with censored, quantized, and generalized group ADMM

CB Issaid, A Elgabli, J Park, M Bennis… - arXiv preprint arXiv …, 2020 - arxiv.org
In this paper, we propose a communication-efficient decentralized machine learning
framework that solves a consensus optimization problem defined over a network of inter …
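The "censored" communication named in this title is commonly implemented as a transmit-only-if-changed rule; a minimal sketch of that generic idea, not the paper's exact rule (the threshold schedule and helper names here are hypothetical):

```python
import numpy as np

def censored_transmit(theta_new, theta_last_sent, tau):
    """Transmit theta_new only if it moved more than tau since the last
    transmitted value; otherwise censor (skip) the message.
    Returns (message_or_None, value_neighbors_will_hold)."""
    if np.linalg.norm(theta_new - theta_last_sent) >= tau:
        return theta_new, theta_new       # send the fresh model
    return None, theta_last_sent          # censored: neighbors keep the stale copy

# Usage: tau typically decays over iterations so censoring becomes
# rarer as the iterates converge.
theta_sent = np.zeros(4)
for k in range(5):
    theta_k = np.ones(4) * 0.1 * k        # stand-in for the k-th local iterate
    msg, theta_sent = censored_transmit(theta_k, theta_sent, tau=0.5 * 0.9**k)
    print("iter", k, "sent" if msg is not None else "censored")
```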

Q-GADMM: Quantized group ADMM for communication efficient decentralized machine learning

A Elgabli, J Park, AS Bedi, CB Issaid… - IEEE Transactions …, 2020 - ieeexplore.ieee.org
In this article, we propose a communication-efficient decentralized machine learning (ML)
algorithm, coined quantized group ADMM (Q-GADMM). To reduce the number of …
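Q-GADMM's quantization, per the snippet, compresses what each worker sends; a minimal sketch of generic unbiased stochastic quantization of a model difference, under assumptions (the bit width, grid, and rounding rule are illustrative, not the paper's exact quantizer):

```python
import numpy as np

def quantize_diff(theta, theta_ref, bits=4):
    """Stochastically quantize (theta - theta_ref) onto a uniform grid.
    Only the integer codes and two scalars need be transmitted."""
    diff = theta - theta_ref
    levels = 2 ** bits - 1
    r = np.max(np.abs(diff)) + 1e-12          # dynamic range of this update
    step = 2 * r / levels
    scaled = (diff + r) / step                 # map onto [0, levels]
    low = np.floor(scaled)
    # round up with probability equal to the fractional part: unbiased
    codes = low + (np.random.rand(*diff.shape) < (scaled - low))
    return codes.astype(np.int64), step, r

def dequantize(codes, step, r, theta_ref):
    return theta_ref + (codes * step - r)

theta_ref = np.zeros(3)
theta = np.array([0.31, -0.70, 0.05])
codes, step, r = quantize_diff(theta, theta_ref)
print(dequantize(codes, step, r, theta_ref))   # close to theta
```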

Communication efficient federated learning via ordered ADMM in a fully decentralized setting

Y Chen, RS Blum, BM Sadler - 2022 56th Annual Conference …, 2022 - ieeexplore.ieee.org
The challenge of communication-efficient distributed optimization has attracted attention in
recent years. In this paper, a communication-efficient algorithm, called ordering-based …

Communication efficient framework for decentralized machine learning

A Elgabli, J Park, AS Bedi, M Bennis… - 2020 54th Annual …, 2020 - ieeexplore.ieee.org
In this paper, we propose a fast, privacy-aware, and communication-efficient decentralized
framework to solve the distributed machine learning (DML) problem. The proposed …

Low sample and communication complexities in decentralized learning: A triple hybrid approach

X Zhang, J Liu, Z Zhu, ES Bentley - IEEE INFOCOM 2021-IEEE …, 2021 - ieeexplore.ieee.org
Network-consensus-based decentralized learning optimization algorithms have attracted a
significant amount of attention in recent years due to their rapidly growing applications …

DQC-ADMM: Decentralized dynamic ADMM with quantized and censored communications

Y Liu, G Wu, Z Tian, Q Ling - IEEE Transactions on Neural …, 2021 - ieeexplore.ieee.org
In distributed learning and optimization, a network of multiple computing units coordinates to
solve a large-scale problem. This article focuses on dynamic optimization over a …

Hybrid ADMM: a unifying and fast approach to decentralized optimization

M Ma, AN Nikolakopoulos, GB Giannakis - EURASIP Journal on …, 2018 - Springer
The present work introduces the hybrid consensus alternating direction method of multipliers
(H-CADMM), a novel framework for optimization over networks which unifies existing …

Faster Convergence with Less Communication: Broadcast-Based Subgraph Sampling for Decentralized Learning over Wireless Networks

DP Herrera, Z Chen, EG Larsson - arXiv preprint arXiv:2401.13779, 2024 - arxiv.org
Consensus-based decentralized stochastic gradient descent (D-SGD) is a widely adopted
algorithm for decentralized training of machine learning models across networked agents. A …
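The D-SGD iteration this entry builds on alternates neighbor averaging with a local gradient step; a minimal sketch, assuming a doubly stochastic mixing matrix W and toy quadratic losses (both illustrative; broadcast-based subgraph sampling itself is not reproduced):

```python
import numpy as np

def dsgd_step(models, grads, W, lr):
    """One D-SGD round: average neighbor models with weights W,
    then take a local gradient step. models has shape (n_agents, dim)."""
    return W @ models - lr * grads

# Toy run: 4 agents on a ring, local losses f_i(x) = 0.5 * ||x - c_i||^2,
# so the local gradient at x_i is simply (x_i - c_i).
W = np.array([[0.50, 0.25, 0.00, 0.25],
              [0.25, 0.50, 0.25, 0.00],
              [0.00, 0.25, 0.50, 0.25],
              [0.25, 0.00, 0.25, 0.50]])   # doubly stochastic ring mixing
c = np.random.randn(4, 2)                  # each agent's local optimum
x = np.zeros((4, 2))
for k in range(500):
    x = dsgd_step(x, x - c, W, lr=1.0 / (k + 10))   # decaying step size
# per-agent distance to the average optimum; all entries become small
print(np.linalg.norm(x - c.mean(axis=0), axis=1))
```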

Laplacian matrix sampling for communication-efficient decentralized learning

CC Chiu, X Zhang, T He, S Wang… - IEEE Journal on …, 2023 - ieeexplore.ieee.org
We consider the problem of training a given machine learning model by decentralized
parallel stochastic gradient descent over training data distributed across multiple nodes …
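A standard bridge between a graph Laplacian L and the averaging step is the mixing matrix W = I - eps * L; a minimal sketch under that assumption (the paper's actual sampling objective for choosing L is not shown in the snippet):

```python
import numpy as np

def mixing_from_laplacian(adj, eps=None):
    """Build W = I - eps * L from an adjacency matrix. For eps in
    (0, 1/max_degree], W is symmetric, nonnegative, doubly stochastic."""
    deg = adj.sum(axis=1)
    L = np.diag(deg) - adj                 # combinatorial graph Laplacian
    if eps is None:
        eps = 1.0 / (deg.max() + 1.0)      # safe step keeping W nonnegative
    return np.eye(len(adj)) - eps * L

adj = np.array([[0, 1, 0, 1],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [1, 0, 1, 0]], dtype=float)   # 4-node ring
W = mixing_from_laplacian(adj)
print(W.sum(axis=0), W.sum(axis=1))           # rows and columns each sum to 1
```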