In this paper, we propose a communication-efficient decentralized machine learning framework that solves a consensus optimization problem defined over a network of inter …
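The snippet above names a consensus optimization problem over a network; for concreteness, the standard formulation reads as follows (generic notation, not taken from the paper itself):

```latex
\min_{\theta_1,\dots,\theta_N}\ \sum_{i=1}^{N} f_i(\theta_i)
\quad \text{s.t.} \quad \theta_i = \theta_j \ \ \forall (i,j) \in \mathcal{E},
```

where $f_i$ is the local empirical loss at worker $i$ and $\mathcal{E}$ is the edge set of the communication graph, so agreement is enforced only between neighboring workers.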
In this paper, we propose a fast, privacy-aware, and communication-efficient decentralized framework to solve the distributed machine learning (DML) problem. The proposed …
In this article, we propose a communication-efficient decentralized machine learning (ML) algorithm, coined quantized group ADMM (Q-GADMM). To reduce the number of …
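The snippet cuts off before describing the mechanism, but Q-GADMM's communication savings come from stochastically quantizing the difference between a worker's newly optimized model and the previously transmitted quantized model. A minimal sketch of that kind of quantizer, assuming uniform levels and stochastic rounding (the function name and bit width are illustrative, not from the paper):

```python
import numpy as np

def stochastic_quantize(delta, bits=4):
    """Uniform quantizer with stochastic rounding: maps `delta` onto
    2**bits grid points spanning its range, unbiased in expectation."""
    levels = 2 ** bits - 1
    lo, hi = float(delta.min()), float(delta.max())
    scale = (hi - lo) / levels if hi > lo else 1.0
    normalized = (delta - lo) / scale               # in [0, levels]
    floor = np.floor(normalized)
    # Round up with probability equal to the fractional remainder.
    q = floor + (np.random.rand(*delta.shape) < (normalized - floor))
    return lo + q * scale   # in practice only (lo, scale, q) are sent

# Quantize the change since the last transmission and track the model
# that neighbors actually hold, so quantization errors do not accumulate.
w_new = np.random.randn(10)    # freshly optimized local model
w_hat = np.zeros(10)           # last quantized model sent to neighbors
w_hat = w_hat + stochastic_quantize(w_new - w_hat)
```

Quantizing the delta rather than the model itself is what keeps the payload small: successive models are close, so the range being quantized shrinks as training converges.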
When the data is distributed across multiple servers, lowering the communication cost between the servers (or workers) while solving the distributed learning problem is an …
X. Zhang, J. Liu, Z. Zhu, E. S. Bentley, IEEE INFOCOM 2021 - IEEE …, 2021, ieeexplore.ieee.org
Network-consensus-based decentralized learning optimization algorithms have attracted significant attention in recent years due to their rapidly growing applications …
C. Park, N. Lee, arXiv preprint arXiv:2302.07475, 2023, arxiv.org
The training efficiency of complex deep learning models can be significantly improved through the use of distributed optimization. However, this process is often hindered by a …
The Alternating Direction Method of Multipliers (ADMM) has witnessed a resurgence of interest over the past few years, fueled by the ever-increasing demand for scalable …
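For reference, the global-consensus form of ADMM that this line of work builds on uses the standard scaled-form updates (following the usual textbook presentation; here $\rho$ is the penalty parameter, $u_i$ the scaled dual variable, and $z$ the global consensus variable):

```latex
\begin{aligned}
x_i^{k+1} &= \arg\min_{x_i}\; f_i(x_i) + \tfrac{\rho}{2}\,\bigl\|x_i - z^k + u_i^k\bigr\|_2^2,\\
z^{k+1}   &= \frac{1}{N}\sum_{i=1}^{N}\bigl(x_i^{k+1} + u_i^k\bigr),\\
u_i^{k+1} &= u_i^k + x_i^{k+1} - z^{k+1}.
\end{aligned}
```

The appeal for distributed learning is visible in the structure: the $x_i$-updates are solved locally and in parallel, and only the averaging step requires communication.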
This article proposes a communication-efficient decentralized deep learning algorithm, coined layer-wise federated group ADMM (L-FGADMM). To minimize an empirical risk …
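The "layer-wise" qualifier refers to exchanging different layers of the deep network at different frequencies rather than transmitting the whole model every round. A toy sketch of such a schedule (the period rule below is hypothetical, chosen only to illustrate that large layers can be communicated less often):

```python
import math

# Hypothetical schedule: give each layer a communication period that grows
# with its parameter count, so large layers are exchanged less frequently.
layer_sizes = {"conv1": 1_728, "conv2": 36_864, "fc": 1_310_720}
periods = {name: max(1, round(math.log10(n))) for name, n in layer_sizes.items()}

def layers_to_send(k):
    """Names of layers whose parameters are exchanged at round k."""
    return [name for name, T in periods.items() if k % T == 0]

print(layers_to_send(6))   # ['conv1', 'fc']: the periods here are 3, 5, 6
```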
G. Yan, S. L. Huang, T. Lan, L. Song, 2021 IEEE 18th …, 2021, ieeexplore.ieee.org
Gradient quantization is an emerging technique for reducing communication costs in distributed learning. Existing gradient quantization algorithms often rely on engineering …
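As a concrete instance of the technique named in the snippet, here is a minimal QSGD-style stochastic quantizer that is unbiased by construction (a sketch of the general idea, not the algorithm proposed in the cited paper; the level count s is a generic parameter):

```python
import numpy as np

def qsgd_quantize(g, s=8):
    """QSGD-style quantizer: each coordinate is encoded by its sign and a
    stochastic integer level in {0, ..., s}, scaled by the l2 norm of g.
    Unbiased: E[qsgd_quantize(g)] equals g, at the cost of extra variance."""
    norm = np.linalg.norm(g)
    if norm == 0.0:
        return np.zeros_like(g)
    level = np.abs(g) / norm * s        # real-valued level in [0, s]
    floor = np.floor(level)
    xi = floor + (np.random.rand(*g.shape) < (level - floor))
    return norm * np.sign(g) * xi / s   # only (norm, signs, xi) need be sent

g = np.random.randn(1_000)
g_hat = np.mean([qsgd_quantize(g) for _ in range(2_000)], axis=0)
print(np.max(np.abs(g_hat - g)))        # small: the quantizer is unbiased
```

The tradeoff the snippet hints at: fewer levels s means fewer bits per coordinate but higher quantization variance, which is exactly the knob these papers tune.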