Forget-SVGD: Particle-Based Bayesian Federated Unlearning

J. Gong, J. Kang, O. Simeone, et al. - 2022 IEEE Data Science and Learning Workshop (DSLW), 2022 - ieeexplore.ieee.org
Variational particle-based Bayesian learning methods have the advantage of not being limited by the bias affecting more conventional parametric techniques. This paper proposes to leverage the flexibility of non-parametric Bayesian approximate inference to develop a novel Bayesian federated unlearning method, referred to as Forget-Stein Variational Gradient Descent (Forget-SVGD). Forget-SVGD builds on SVGD – a particle-based approximate Bayesian inference scheme using gradient-based deterministic updates – and on its distributed (federated) extension known as Distributed SVGD (DSVGD). Upon the completion of federated learning, when one or more participating agents request that their data be "forgotten", Forget-SVGD carries out local SVGD updates at the agents whose data need to be "unlearned", interleaved with communication rounds with a parameter server. The proposed method is validated via performance comparisons with non-parametric schemes that train from scratch while excluding the data to be forgotten, as well as with existing parametric Bayesian unlearning methods.
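For readers unfamiliar with the building block the paper starts from, the following is a minimal sketch of the standard SVGD update that Forget-SVGD extends. This is not the paper's federated or unlearning procedure; it is only the basic particle update, with an RBF kernel and a fixed bandwidth chosen here for illustration (the bandwidth heuristic and target distribution are assumptions, not taken from the paper).

```python
import numpy as np

def rbf_kernel(X, h=1.0):
    # Pairwise RBF kernel k(x_j, x_i) = exp(-||x_j - x_i||^2 / (2 h^2))
    diff = X[:, None, :] - X[None, :, :]          # (n, n, d)
    K = np.exp(-np.sum(diff ** 2, axis=-1) / (2 * h ** 2))  # (n, n)
    # grad_{x_j} k(x_j, x_i) = -(x_j - x_i) / h^2 * k(x_j, x_i)
    grad_K = -diff / h ** 2 * K[:, :, None]       # (n, n, d), axis 0 indexes j
    return K, grad_K

def svgd_step(X, grad_logp, step=0.1, h=1.0):
    """One deterministic SVGD update of the particle set X (shape (n, d)).

    phi(x_i) = (1/n) * sum_j [ k(x_j, x_i) * grad log p(x_j)
                               + grad_{x_j} k(x_j, x_i) ]
    The first term drives particles toward high-density regions of the
    target p; the second is a repulsive term that keeps them spread out.
    """
    n = X.shape[0]
    K, grad_K = rbf_kernel(X, h)
    phi = (K.T @ grad_logp(X) + grad_K.sum(axis=0)) / n
    return X + step * phi
```

As a usage example (again an illustrative assumption), taking a standard Gaussian target gives `grad_logp = lambda Z: -Z`, and iterating `svgd_step` transports an initially misplaced particle cloud toward the target. DSVGD and Forget-SVGD orchestrate updates of this form across agents via a parameter server rather than on a single particle set.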