Achieving linear speedup in asynchronous federated learning with heterogeneous clients

X Wang, Z Li, S Jin, J Zhang - IEEE Transactions on Mobile …, 2024 - ieeexplore.ieee.org
Federated learning (FL) is an emerging distributed training paradigm that aims to learn a
common global model without exchanging or transferring the data that are stored locally at …

Differentially Private Online Federated Learning with Correlated Noise

J Zhang, L Zhu, M Johansson - arXiv preprint arXiv:2403.16542, 2024 - arxiv.org
We propose a novel differentially private algorithm for online federated learning that
employs temporally correlated noise to improve the utility while ensuring the privacy of the …
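The abstract names temporally correlated noise as the privacy mechanism. A minimal sketch of what such noise can look like, using a simple AR(1) recursion (the paper's actual mechanism and its privacy calibration are not reproduced here; `phi`, `sigma`, and the perturbation of a zero gradient below are hypothetical illustration):

```python
import numpy as np

def correlated_noise_stream(dim, steps, sigma=1.0, phi=0.9, seed=0):
    """Yield temporally correlated Gaussian noise via an AR(1) recursion.

    Illustrative only: phi and sigma are hypothetical parameters, not the
    paper's calibrated values. The sqrt(1 - phi**2) scaling of the fresh
    innovation keeps the stationary variance at sigma**2, so correlation
    is added without inflating per-step noise magnitude.
    """
    rng = np.random.default_rng(seed)
    n = rng.normal(0.0, sigma, size=dim)
    for _ in range(steps):
        yield n
        n = phi * n + np.sqrt(1.0 - phi**2) * rng.normal(0.0, sigma, size=dim)

# A client would perturb each gradient before reporting it to the server:
grad = np.zeros(3)
noisy = [grad + n for n in correlated_noise_stream(dim=3, steps=5)]
```

Successive samples are correlated (correlation roughly `phi` at lag one), which is the property such schemes exploit to cancel accumulated noise over time.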

Locally Differentially Private Online Federated Learning With Correlated Noise

J Zhang, L Zhu, D Fay, M Johansson - arXiv preprint arXiv:2411.18752, 2024 - arxiv.org
We introduce a locally differentially private (LDP) algorithm for online federated learning that
employs temporally correlated noise to improve utility while preserving privacy. To address …

Dual-Delayed Asynchronous SGD for Arbitrarily Heterogeneous Data

X Wang, Y Sun, HT Wai, J Zhang - arXiv preprint arXiv:2405.16966, 2024 - arxiv.org
We consider the distributed learning problem with data dispersed across multiple workers
under the orchestration of a central server. Asynchronous Stochastic Gradient Descent …
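The setting the abstract describes, a central server applying gradients that workers computed at stale copies of the model, can be sketched as a toy loop (this is the generic asynchronous-SGD setting, not the paper's dual-delayed algorithm; `grad_fns`, `max_delay`, and `lr` are hypothetical):

```python
import numpy as np

def async_sgd(grad_fns, x0, lr=0.1, max_delay=2, iters=200, seed=0):
    """Toy asynchronous SGD: each server step applies a gradient that one
    worker computed at a randomly stale copy of the model.

    A minimal sketch of the asynchronous setting only; the dual-delayed
    correction studied in the paper is not implemented here.
    """
    rng = np.random.default_rng(seed)
    history = [np.array(x0, dtype=float)]       # past model iterates
    for _ in range(iters):
        worker = rng.integers(len(grad_fns))    # which worker reports now
        delay = rng.integers(min(max_delay, len(history) - 1) + 1)
        stale_x = history[-1 - delay]           # model the worker computed on
        history.append(history[-1] - lr * grad_fns[worker](stale_x))
    return history[-1]

# Two workers with heterogeneous quadratic objectives (minima at +1 and -1);
# the average objective is minimized at 0, so iterates hover around it.
fns = [lambda x: 2 * (x - 1.0), lambda x: 2 * (x + 1.0)]
x_final = async_sgd(fns, x0=[5.0])
```

With heterogeneous data, each worker's stale gradient points toward its own minimizer, which is exactly why plain asynchronous SGD oscillates around the average optimum rather than converging to it.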

Incremental Aggregated Asynchronous SGD for Arbitrarily Heterogeneous Data

openreview.net
We consider the distributed learning problem with data dispersed across multiple workers
under the orchestration of a central server. Asynchronous Stochastic Gradient Descent …