Decentralized learning works: An empirical comparison of gossip learning and federated learning

I Hegedűs, G Danner, M Jelasity - Journal of Parallel and Distributed …, 2021 - Elsevier
Machine learning over distributed data stored by many clients has important
applications in use cases where data privacy is a key concern or central data storage is not …

Gossip learning as a decentralized alternative to federated learning

I Hegedűs, G Danner, M Jelasity - … Interoperable Systems: 19th IFIP WG 6.1 …, 2019 - Springer
Federated learning is a distributed machine learning approach for computing models over
data collected by edge devices. Most importantly, the data itself is not collected centrally, but …

Decentralized federated learning: A segmented gossip approach

C Hu, J Jiang, Z Wang - arXiv preprint arXiv:1908.07782, 2019 - arxiv.org
The emerging concern about data privacy and security has motivated the proposal of
federated learning, which allows nodes to only synchronize the locally-trained models …

Gossip learning with linear models on fully distributed data

R Ormándi, I Hegedűs, M Jelasity - … and Computation: Practice …, 2013 - Wiley Online Library
Machine learning over fully distributed data poses an important problem in peer‐to‐peer
applications. In this model, we have one data record at each network node but without the …

A survey on federated learning

C Zhang, Y Xie, H Bai, B Yu, W Li, Y Gao - Knowledge-Based Systems, 2021 - Elsevier
Federated learning is a set-up in which multiple clients collaborate to solve machine
learning problems under the coordination of a central aggregator. This setting also …

BACombo—bandwidth-aware decentralized federated learning

J Jiang, L Hu, C Hu, J Liu, Z Wang - Electronics, 2020 - mdpi.com
The emerging concern about data privacy and security has motivated the proposal of
federated learning. Federated learning allows computing nodes to only synchronize the …

GossipFL: A decentralized federated learning framework with sparsified and adaptive communication

Z Tang, S Shi, B Li, X Chu - IEEE Transactions on Parallel and …, 2022 - ieeexplore.ieee.org
Recently, federated learning (FL) techniques have enabled multiple users to train machine
learning models collaboratively without data sharing. However, existing FL algorithms suffer …

FedPAQ: A communication-efficient federated learning method with periodic averaging and quantization

A Reisizadeh, A Mokhtari, H Hassani… - International …, 2020 - proceedings.mlr.press
Federated learning is a distributed framework in which a model is trained over a
set of devices while keeping the data localized. This framework faces several systems-oriented …

Accelerating gossip SGD with periodic global averaging

Y Chen, K Yuan, Y Zhang, P Pan… - … on Machine Learning, 2021 - proceedings.mlr.press
Communication overhead hinders the scalability of large-scale distributed training. Gossip
SGD, where each node averages only with its neighbors, is more communication-efficient …
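Based only on this one-line description (neighbor-only averaging) and the title's periodic global averaging, a minimal sketch of the idea follows; the ring topology, step counts, and averaging period are assumptions rather than details from the paper:

```python
import numpy as np

def gossip_average(models, neighbors):
    # Each node replaces its model by the average of itself and its neighbors.
    return [np.mean([models[j] for j in [i] + neighbors[i]], axis=0)
            for i in range(len(models))]

def gossip_sgd(grads, models, lr=0.1, rounds=100, global_every=10):
    # Gossip SGD on a ring, with a full (global) average every `global_every` rounds.
    n = len(models)
    neighbors = [[(i - 1) % n, (i + 1) % n] for i in range(n)]   # ring topology (assumed)
    for t in range(rounds):
        models = [m - lr * g(m) for m, g in zip(models, grads)]  # local SGD step
        if (t + 1) % global_every == 0:
            avg = np.mean(models, axis=0)                        # periodic global averaging
            models = [avg.copy() for _ in range(n)]
        else:
            models = gossip_average(models, neighbors)           # cheap neighbor-only averaging
    return models

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    targets = rng.normal(size=(8, 3))                   # each node pulls toward its own target
    grads = [(lambda m, t=t: m - t) for t in targets]   # gradient of 0.5 * ||m - t||^2
    models = gossip_sgd(grads, [np.zeros(3) for _ in range(8)])
    print("consensus model:", np.round(models[0], 3))   # roughly the mean of the targets
```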

Federated learning: A survey on enabling technologies, protocols, and applications

M Aledhari, R Razzak, RM Parizi, F Saeed - IEEE Access, 2020 - ieeexplore.ieee.org
This paper provides a comprehensive study of Federated Learning (FL) with an emphasis
on enabling software and hardware platforms, protocols, real-life applications and use …