FedPop: A Bayesian approach for personalised federated learning

N Kotelevskii, M Vono, A Durmus… - Advances in Neural …, 2022 - proceedings.neurips.cc
Personalised federated learning (FL) aims at collaboratively learning a machine learning
model tailored for each client. Although promising advances have been made in this direction …

Bayesian federated learning: A survey

L Cao, H Chen, X Fan, J Gama, YS Ong… - arXiv preprint arXiv …, 2023 - arxiv.org
Federated learning (FL) demonstrates its advantages in integrating distributed infrastructure,
communication, computing and learning in a privacy-preserving manner. However, the …

Compression with exact error distribution for federated learning

M Hegazy, R Leluc, CT Li, A Dieuleveut - arXiv preprint arXiv:2310.20682, 2023 - arxiv.org
Compression schemes have been extensively used in Federated Learning (FL) to reduce
the communication cost of distributed learning. While most approaches rely on a bounded …
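
A classical construction with exactly this flavour, a compressor whose error has a known exact distribution rather than merely a bounded norm, is subtractive dithered quantization: with a dither shared between sender and receiver, the reconstruction error is exactly uniform, independent of the input. The NumPy sketch below shows that textbook scheme as an illustration, not the paper's own compressor:

```python
import numpy as np

def dithered_quantize(x, delta, rng):
    """Subtractive dithered quantization: with a dither u shared by sender
    and receiver, the reconstruction error is exactly Uniform(-delta/2,
    delta/2), independent of x (classical result, used here only to
    illustrate compression with a known exact error distribution)."""
    u = rng.uniform(-delta / 2, delta / 2, size=x.shape)  # shared dither
    q = delta * np.round((x + u) / delta)                 # quantize x + u
    return q - u                                          # receiver subtracts u

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)
x_hat = dithered_quantize(x, delta=0.5, rng=rng)
err = x_hat - x
print(err.min(), err.max())   # stays within (-0.25, 0.25)
print(abs(err.mean()))        # ~0: the scheme is unbiased
```

Because the error law is known exactly, it can double as calibrated noise, which is the general direction this line of work exploits.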

Bayesian personalized federated learning with shared and personalized uncertainty representations

H Chen, H Liu, L Cao, T Zhang - arXiv preprint arXiv:2309.15499, 2023 - arxiv.org
Bayesian personalized federated learning (BPFL) addresses challenges in existing
personalized FL (PFL). BPFL aims to quantify the uncertainty and heterogeneity within and …
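
To make the shared-plus-personalized idea concrete, here is a minimal, hypothetical sketch (plain NumPy, not the paper's architecture): each client's predictive weight is the sum of a draw from a server-aggregated shared posterior and a draw from a posterior that never leaves the client, so predictive uncertainty decomposes into a shared and a personalized part.

```python
import numpy as np

rng = np.random.default_rng(1)

class GaussianPosterior:
    """Mean-field Gaussian over a weight vector: parameters (mu, log_sigma)."""
    def __init__(self, dim):
        self.mu = np.zeros(dim)
        self.log_sigma = np.full(dim, -2.0)
    def sample(self):
        return self.mu + np.exp(self.log_sigma) * rng.normal(size=self.mu.shape)

d, n_clients = 8, 3
shared = GaussianPosterior(d)                             # aggregated at the server
local = [GaussianPosterior(d) for _ in range(n_clients)]  # stays on client k

def predict(x, k, n_samples=50):
    """Monte Carlo predictive mean/std for client k: the effective weight is
    a shared draw plus a client-specific draw, so the predictive spread has
    both a shared and a personalized component."""
    preds = np.array([x @ (shared.sample() + local[k].sample())
                      for _ in range(n_samples)])
    return preds.mean(), preds.std()

x = rng.normal(size=d)
print(predict(x, k=0))
```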

Wireless federated Langevin Monte Carlo: Repurposing channel noise for Bayesian sampling and privacy

D Liu, O Simeone - IEEE Transactions on Wireless …, 2022 - ieeexplore.ieee.org
Most works on federated learning (FL) focus on the common frequentist formulation of
learning, whereby the goal is to minimize the global empirical loss. Frequentist learning …
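
The mechanism named in the title can be sketched compactly: if over-the-air aggregation delivers the summed gradient corrupted by additive Gaussian channel noise, and the effective noise power is tuned to 2/η, then each noisy reception is exactly one unadjusted Langevin step, with the channel supplying the diffusion term for free. The toy below (Gaussian-mean posterior, simulated channel) is a conceptual sketch of that premise, not the paper's system model:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy target: posterior over theta for a Gaussian mean with unit variance,
# data split across 4 clients.
data = [rng.normal(loc=2.0, size=50) for _ in range(4)]

def local_grad(theta, shard):
    return (theta - shard).sum()    # grad of the local negative log-likelihood

eta = 1e-3                          # Langevin step size
sigma_ch = np.sqrt(2.0 / eta)       # channel-noise std tuned so that
                                    # eta * noise == sqrt(2 * eta) * N(0, 1)

theta, samples = 0.0, []
for t in range(5000):
    # Over-the-air aggregation: superposed analog transmissions deliver the
    # summed gradient plus Gaussian channel noise.  Rather than injecting
    # fresh MCMC noise, the channel noise itself serves as the Langevin
    # diffusion term.
    rx = sum(local_grad(theta, s) for s in data) + sigma_ch * rng.normal()
    theta = theta - eta * rx
    if t > 1000:
        samples.append(theta)

print(np.mean(samples))   # close to the posterior mean of ~2.0
```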

Forget-SVGD: Particle-based Bayesian federated unlearning

J Gong, J Kang, O Simeone… - 2022 IEEE Data Science …, 2022 - ieeexplore.ieee.org
Variational particle-based Bayesian learning methods have the advantage of not being
limited by the bias affecting more conventional parametric techniques. This paper proposes …
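
The particle method the title refers to, Stein variational gradient descent (SVGD), updates a set of particles with a kernel-smoothed score term plus a repulsive term that keeps the particles spread out. A minimal NumPy version on a standard normal target (a stand-in for whatever posterior remains after unlearning) looks like this:

```python
import numpy as np

rng = np.random.default_rng(3)

def grad_log_p(x):
    """Score of a standard normal target, standing in for the retained posterior."""
    return -x

def svgd_step(x, eps=0.05, h=0.5):
    """One SVGD update on particles x of shape (n, d): kernel-weighted score
    (attraction to high density) plus kernel gradient (repulsion)."""
    diff = x[:, None, :] - x[None, :, :]      # pairwise differences
    sq = (diff ** 2).sum(-1)
    k = np.exp(-sq / (2 * h))                 # RBF kernel matrix
    grad_k = -diff / h * k[:, :, None]        # grad of k wrt its first argument
    phi = (k @ grad_log_p(x) + grad_k.sum(0)) / len(x)
    return x + eps * phi

x = rng.normal(loc=5.0, size=(100, 1))        # particles start far from the target
for _ in range(500):
    x = svgd_step(x)
print(x.mean(), x.std())                      # drifts toward N(0, 1)
```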

Parallel MCMC without embarrassing failures

DA De Souza, D Mesquita, S Kaski… - International …, 2022 - proceedings.mlr.press
Embarrassingly parallel Markov Chain Monte Carlo (MCMC) exploits parallel
computing to scale Bayesian inference to large datasets by using a two-step approach. First …
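
The two steps are: run independent MCMC on each data shard to sample its "subposterior", then combine the shard-level samples into draws from the full posterior. The sketch below uses exact Gaussian subposterior draws and the classical parametric combination rule (fit a Gaussian per shard, multiply); the "embarrassing failures" the paper addresses are precisely the cases where such simple combinations break down:

```python
import numpy as np

rng = np.random.default_rng(4)

# Step 1: each worker samples its own subposterior.  The shards here are
# Gaussian, so exact draws replace an actual MCMC chain to keep this short.
K, n_per_shard = 4, 250
shards = [rng.normal(loc=1.5, size=n_per_shard) for _ in range(K)]
sub_samples = []
for shard in shards:
    # Subposterior for a N(theta, 1) likelihood under a flat prior:
    # N(mean(shard), 1/len(shard)).
    sub_samples.append(rng.normal(shard.mean(),
                                  1 / np.sqrt(len(shard)), size=2000))

# Step 2: combine.  The classical parametric recipe fits a Gaussian to each
# sample set and multiplies the densities; this toy Gaussian case is exactly
# where that approximation is safe.
precisions = np.array([1 / s.var() for s in sub_samples])
means = np.array([s.mean() for s in sub_samples])
post_var = 1 / precisions.sum()
post_mean = post_var * (precisions * means).sum()
print(post_mean, np.sqrt(post_var))   # matches the full-data posterior
```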

On convergence of federated averaging Langevin dynamics

W Deng, Q Zhang, YA Ma, Z Song, G Lin - arXiv preprint arXiv:2112.05120, 2021 - arxiv.org
We propose a federated averaging Langevin algorithm (FA-LD) for uncertainty quantification
and mean predictions with distributed clients. In particular, we generalize beyond normal …
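
The algorithm's shape, each client taking local Langevin steps between federated-averaging rounds, can be sketched on a toy Gaussian-mean posterior. The noise scaling below is chosen so that the averaged iterate behaves like vanilla Langevin on the full posterior; it is a conceptual illustration, not the paper's exact recursion or step-size schedule:

```python
import numpy as np

rng = np.random.default_rng(5)

K = 4
shards = [rng.normal(loc=-1.0, size=100) for _ in range(K)]

def local_grad(theta, shard):
    return (theta - shard).sum()      # grad of the local negative log-likelihood

eta, local_steps = 5e-4, 10
theta = np.zeros(K)                   # one local iterate per client
samples = []
for rnd in range(2000):
    for k in range(K):
        for _ in range(local_steps):
            # Each client scales its gradient by K and injects noise of
            # variance 2*eta*K, so the *averaged* iterate carries the
            # standard sqrt(2*eta) Langevin diffusion term.
            noise = np.sqrt(2 * eta * K) * rng.normal()
            theta[k] = theta[k] - eta * K * local_grad(theta[k], shards[k]) + noise
    theta[:] = theta.mean()           # federated averaging round
    samples.append(theta[0])

print(np.mean(samples[200:]))         # close to the data mean of ~ -1.0
```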

ELF: Federated Langevin algorithms with primal, dual and bidirectional compression

A Karagulyan, P Richtárik - arXiv preprint arXiv:2303.04622, 2023 - arxiv.org
Federated sampling algorithms have recently gained great popularity in the machine
learning and statistics communities. This paper studies variants of such algorithms called Error …
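
A standard ingredient in this family is a contractive compressor corrected by error feedback: whatever the compressor drops is stored locally and re-injected at the next round. The sketch below couples a top-k compressor and an error-feedback buffer with a Langevin step on a toy Gaussian target; it shows primal (client-to-server) compression only and is not ELF's exact update:

```python
import numpy as np

rng = np.random.default_rng(6)

def top_k(v, k):
    """Keep the k largest-magnitude coordinates, zero the rest (a contractive
    compressor, the kind error feedback is designed to correct)."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

# Toy target: N(mu, I) posterior in d dimensions, gradient U'(x) = x - mu.
d = 10
mu = np.linspace(-1, 1, d)
eta = 1e-2
x = np.zeros(d)
memory = np.zeros(d)                  # error-feedback buffer on the client
samples = []
for t in range(20000):
    g = x - mu                        # local gradient of -log target
    c = top_k(g + memory, k=3)        # compress gradient plus carried error
    memory = (g + memory) - c         # store what the compressor dropped
    # Server-side Langevin step using only the compressed message.
    x = x - eta * c + np.sqrt(2 * eta) * rng.normal(size=d)
    if t > 2000:
        samples.append(x.copy())

S = np.array(samples)
print(np.round(S.mean(0), 2))         # near mu despite heavy compression
```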

Enhancing Low-Precision Sampling via Stochastic Gradient Hamiltonian Monte Carlo

Z Wang, Y Chen, Q Song, R Zhang - arXiv preprint arXiv:2310.16320, 2023 - arxiv.org
Low-precision training has emerged as a promising low-cost technique to enhance the
training efficiency of deep neural networks without sacrificing much accuracy. Its Bayesian …
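
A minimal picture of the low-precision sampling setting: an SGHMC-style update whose iterates are forced onto a coarse fixed-point grid by stochastic rounding, the unbiased rounding rule low-precision samplers typically rely on to avoid systematic bias. This is a toy sketch on a standard normal target, with hyperparameters chosen for the demo rather than taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(7)

def stochastic_round(x, delta):
    """Round x to the grid delta*Z, up or down at random so that
    E[round(x)] = x; this unbiasedness keeps the low-precision sampler
    from accumulating a systematic drift."""
    low = np.floor(x / delta) * delta
    p_up = (x - low) / delta
    return low + delta * (rng.random(np.shape(x)) < p_up)

def grad_U(x):
    return x                          # negative log-density gradient of N(0, 1)

eta, alpha, delta = 1e-2, 0.1, 1 / 64  # step size, friction, quantization gap
x, v = 0.0, 0.0
samples = []
for t in range(100_000):
    # SGHMC-style momentum update with friction alpha and matched noise.
    v = (1 - alpha) * v - eta * grad_U(x) + np.sqrt(2 * alpha * eta) * rng.normal()
    x = stochastic_round(x + v, delta)  # parameters live on a low-precision grid
    if t > 5000:
        samples.append(x)

print(np.mean(samples), np.std(samples))  # roughly N(0, 1)
```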