In this paper, we propose a communication-efficient decentralized machine learning framework that solves a consensus optimization problem defined over a network of inter …
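Throughout the results below, "consensus optimization" refers to the standard problem in which n networked agents, each holding a local loss f_i, must agree on a single model. In LaTeX, with generic notation assumed here rather than taken from any one abstract:

\min_{x \in \mathbb{R}^d} \; \frac{1}{n} \sum_{i=1}^{n} f_i(x)
\quad \Longleftrightarrow \quad
\min_{x_1, \dots, x_n} \; \frac{1}{n} \sum_{i=1}^{n} f_i(x_i)
\;\; \text{s.t.} \;\; x_i = x_j \;\; \forall (i,j) \in \mathcal{E},

where \mathcal{E} is the edge set of the communication graph; the equality constraints couple neighboring agents and replace a central coordinator.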
In this article, we propose a communication-efficient decentralized machine learning (ML) algorithm, coined quantized group ADMM (Q-GADMM). To reduce the number of …
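Q-GADMM's exact quantizer is specified in the paper itself; as a generic, minimal sketch of the communication-saving idea behind such methods (quantizing the change in model iterates before transmission), consider the stochastic uniform quantizer below. All names (quantize, dequantize, num_bits) are illustrative assumptions, not the paper's API.

import numpy as np

def quantize(delta, num_bits=4, rng=None):
    # Stochastic uniform quantizer: each coordinate of the model-update
    # vector `delta` is transmitted as a `num_bits` integer plus one
    # shared float scale, instead of a full-precision float.
    rng = np.random.default_rng() if rng is None else rng
    levels = 2 ** num_bits - 1
    scale = float(np.max(np.abs(delta)))
    if scale == 0.0:
        return np.zeros(delta.shape, dtype=np.int64), 0.0
    # Map [-scale, scale] onto [0, levels], then round stochastically
    # so the quantizer is unbiased in expectation.
    normalized = (delta / scale + 1.0) / 2.0 * levels
    floor = np.floor(normalized)
    q = floor + (rng.random(delta.shape) < (normalized - floor))
    return q.astype(np.int64), scale

def dequantize(q, scale, num_bits=4):
    # Inverse affine map back to the original value range.
    levels = 2 ** num_bits - 1
    return (q / levels * 2.0 - 1.0) * scale

With 4 bits per coordinate instead of 32, each exchanged update is roughly 8x smaller, at the cost of zero-mean quantization noise that the convergence analyses in this line of work account for.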
The challenge of communication-efficient distributed optimization has attracted attention in recent years. In this paper, a communication-efficient algorithm, called ordering-based …
In this paper, we propose a fast, privacy-aware, and communication-efficient decentralized framework to solve the distributed machine learning (DML) problem. The proposed …
X Zhang, J Liu, Z Zhu, ES Bentley - IEEE INFOCOM 2021 - IEEE …, 2021 - ieeexplore.ieee.org
Network-consensus-based decentralized learning optimization algorithms have attracted a significant amount of attention in recent years due to their rapidly growing applications …
Y Liu, G Wu, Z Tian, Q Ling - IEEE Transactions on Neural …, 2021 - ieeexplore.ieee.org
In distributed learning and optimization, a network of multiple computing units coordinates to solve a large-scale problem. This article focuses on dynamic optimization over a …
The present work introduces the hybrid consensus alternating direction method of multipliers (H-CADMM), a novel framework for optimization over networks which unifies existing …
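For readers comparing the ADMM-based entries above: consensus-ADMM variants such as H-CADMM specialize the generic ADMM iteration for \min f(x) + g(z) subject to Ax + Bz = c. This is the standard textbook form (in scaled-dual notation), not a result specific to any paper listed here:

\begin{aligned}
x^{k+1} &= \arg\min_x \; f(x) + \tfrac{\rho}{2}\,\lVert Ax + Bz^k - c + u^k \rVert_2^2, \\
z^{k+1} &= \arg\min_z \; g(z) + \tfrac{\rho}{2}\,\lVert Ax^{k+1} + Bz - c + u^k \rVert_2^2, \\
u^{k+1} &= u^k + Ax^{k+1} + Bz^{k+1} - c,
\end{aligned}

where u^k = y^k / \rho is the scaled dual variable. In the consensus setting, f splits across agents as \sum_i f_i(x_i) and the constraints enforce agreement between local copies.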
Consensus-based decentralized stochastic gradient descent (D-SGD) is a widely adopted algorithm for decentralized training of machine learning models across networked agents. A …
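The D-SGD update this line of work analyzes is simple to state. Below is a minimal sketch of one synchronous round, assuming a doubly stochastic mixing matrix W whose entry W[i, j] is zero unless agents i and j are neighbors; the function name and array shapes are illustrative, not from any cited paper.

import numpy as np

def dsgd_round(params, grads, W, lr=0.1):
    # One synchronous round of consensus-based decentralized SGD:
    #   x_i <- sum_j W[i, j] * x_j  -  lr * g_i
    # params: (n_agents, dim) stacked local models
    # grads:  (n_agents, dim) local stochastic gradients
    # W:      (n_agents, n_agents) doubly stochastic mixing matrix
    return W @ params - lr * grads

# Example: 4 agents on a ring with equal neighbor weights.
W = np.array([[0.5 , 0.25, 0.  , 0.25],
              [0.25, 0.5 , 0.25, 0.  ],
              [0.  , 0.25, 0.5 , 0.25],
              [0.25, 0.  , 0.25, 0.5 ]])

Each agent thus communicates only with its neighbors, and the spectral gap of W governs how fast the local models reach consensus.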
CC Chiu, X Zhang, T He, S Wang… - IEEE Journal on …, 2023 - ieeexplore.ieee.org
We consider the problem of training a given machine learning model by decentralized parallel stochastic gradient descent over training data distributed across multiple nodes …