Escaping Saddle Points in Heterogeneous Federated Learning via Distributed SGD with Communication Compression

S Chen, Z Li, Y Chi - International Conference on Artificial …, 2024 - proceedings.mlr.press
We consider the problem of finding second-order stationary points in the optimization of
heterogeneous federated learning (FL). Previous works in FL mostly focus on first-order …

Sparse training for federated learning with regularized error correction

R Greidi, K Cohen - IEEE Journal of Selected Topics in Signal …, 2024 - ieeexplore.ieee.org
Federated Learning (FL) is an emerging paradigm that allows for decentralized machine
learning (ML), where multiple models are collaboratively trained in a privacy-preserving …

Smoothed Gradient Clipping and Error Feedback for Distributed Optimization under Heavy-Tailed Noise

S Yu, D Jakovetic, S Kar - arXiv preprint arXiv:2310.16920, 2023 - arxiv.org
Motivated by understanding and analysis of large-scale machine learning under heavy-
tailed gradient noise, we study distributed optimization with smoothed gradient clipping, i.e., …

The Privacy Power of Correlated Noise in Decentralized Learning

Y Allouah, A Koloskova, AE Firdoussi, M Jaggi… - arXiv preprint arXiv …, 2024 - arxiv.org
Decentralized learning is appealing as it enables the scalable usage of large amounts of
distributed data and resources (without resorting to any central entity), while promoting …

Local Differential Privacy for Decentralized Online Stochastic Optimization with Guaranteed Optimality and Convergence Speed

Z Chen, Y Wang - IEEE Transactions on Automatic Control, 2024 - ieeexplore.ieee.org
The increasing usage of streaming data has raised significant privacy concerns in
decentralized optimization and learning applications. To address this issue, differential …

Communication-Efficient Federated Learning via Sparse Training with Regularized Error Correction

R Greidi, K Cohen - 2024 60th Annual Allerton Conference on …, 2024 - ieeexplore.ieee.org
Federated Learning (FL) is an emerging paradigm that allows for decentralized machine
learning (ML), where multiple models are collaboratively trained in a privacy-preserving …

Sensitivity-Aware Differentially Private Decentralized Learning with Adaptive Noise

Z Zhu, Y Huang, X Wang, J Xu - openreview.net
Most existing decentralized learning methods with differential privacy (DP) employ fixed-
level Gaussian noise during training, regardless of gradient convergence, which …