Authors
Zhiwei Tang, Yanmeng Wang, Tsung-Hui Chang
Publication date
2024/3/24
Journal
Proceedings of the AAAI Conference on Artificial Intelligence
Volume
38
Issue
14
Pages
15301-15309
Description
Federated Learning (FL) is a promising privacy-preserving distributed learning paradigm, but it suffers from high communication costs when training large-scale machine learning models. Sign-based methods, such as SignSGD (Bernstein et al., 2018), have been proposed as a biased gradient compression technique for reducing the communication cost. However, sign-based algorithms can diverge under heterogeneous data, which has motivated advanced techniques, such as the error-feedback method and stochastic sign-based compression, to fix this issue. Nevertheless, these methods still suffer from slower convergence rates, and none of them allows multiple local SGD updates like FedAvg (McMahan et al., 2017). In this paper, we propose a novel noisy perturbation scheme with a general symmetric noise distribution for sign-based compression, which not only allows one to flexibly control the bias-variance tradeoff for the compressed gradient, but also provides a unified viewpoint on existing stochastic sign-based methods. More importantly, the proposed scheme enables the development of the very first sign-based FedAvg algorithm (z-SignFedAvg) to accelerate convergence. Theoretically, we show that z-SignFedAvg achieves a faster convergence rate than existing sign-based methods and, under uniformly distributed noise, enjoys the same convergence rate as its uncompressed counterpart. Extensive experiments demonstrate that z-SignFedAvg achieves competitive empirical performance on real datasets and outperforms existing schemes.
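The mechanism the abstract describes is a perturb-then-sign compressor: each gradient coordinate is perturbed with symmetric noise before its sign is transmitted. Below is a minimal illustrative sketch in NumPy, assuming uniform noise as one instance of the general symmetric family; the function name noisy_sign_compress and the scale parameter are hypothetical, not the paper's API. With z ~ Uniform[-scale, scale] and |g_i| <= scale, E[sign(g_i + z)] = g_i / scale, so the one-bit-per-coordinate message is unbiased up to a known factor, consistent with the abstract's claim that uniform noise matches the uncompressed rate.

    import numpy as np

    def noisy_sign_compress(grad, scale, rng):
        """Perturb-then-sign compression (illustrative sketch, not the paper's code).

        Each coordinate is perturbed with symmetric uniform noise on
        [-scale, scale] before taking the sign. For |g_i| <= scale,
        E[sign(g_i + z_i)] = g_i / scale, so averaging many clients'
        signs and multiplying by `scale` recovers the gradient in
        expectation, at one bit per coordinate on the wire.
        """
        noise = rng.uniform(-scale, scale, size=grad.shape)
        return np.sign(grad + noise).astype(np.int8)

    # Toy server-side check with simulated clients sharing one gradient.
    rng = np.random.default_rng(0)
    true_grad = np.array([0.3, -0.7, 0.05])
    scale = 1.0
    signs = [noisy_sign_compress(true_grad, scale, rng) for _ in range(10000)]
    estimate = scale * np.mean(signs, axis=0)  # approaches true_grad
    print(estimate)

Under this sketch, a heavier-tailed (or narrower) symmetric noise distribution trades bias for variance in the compressed estimate, which is the bias-variance control knob the abstract refers to.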
Total citations
Scholar articles
Z Tang, Y Wang, TH Chang - Proceedings of the AAAI Conference on Artificial …, 2024
Z Tang, Y Wang, TH Chang - Workshop on Federated Learning: Recent Advances …, 2022