Federated unlearning with knowledge distillation

C Wu, S Zhu, P Mitra - arXiv preprint arXiv:2201.09441, 2022 - arxiv.org
… federated unlearning method to eliminate a client’s contribution by subtracting the accumulated
historical updates from the model and leveraging the knowledge distillation method …
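
The snippet sketches a two-step recipe: subtract the client's stored update history, then repair the damaged model with distillation. Below is a minimal sketch of that shape, assuming the server has kept each client's per-round parameter deltas and holds some data for the distillation pass; function names, data access, and hyperparameters are illustrative assumptions, not the paper's exact design.

```python
import copy
import torch
import torch.nn.functional as F

def unlearn_client(global_model, client_deltas, distill_loader,
                   temperature=2.0, epochs=1, lr=1e-3):
    """Sketch: erase a client by subtracting its accumulated historical
    updates, then restore performance via knowledge distillation.
    `client_deltas` is the list of that client's per-round parameter
    deltas the server is assumed to have stored."""
    student = copy.deepcopy(global_model)
    # Step 1: subtract the target client's accumulated historical updates.
    with torch.no_grad():
        for delta in client_deltas:
            for p, d in zip(student.parameters(), delta):
                p.sub_(d)
    # Step 2: distill from the pre-unlearning global model (teacher) to
    # repair the skewed student on server-held data (an assumption here).
    global_model.eval()
    opt = torch.optim.SGD(student.parameters(), lr=lr)
    for _ in range(epochs):
        for x, _ in distill_loader:
            with torch.no_grad():
                teacher = F.softmax(global_model(x) / temperature, dim=1)
            log_student = F.log_softmax(student(x) / temperature, dim=1)
            loss = F.kl_div(log_student, teacher,
                            reduction="batchmean") * temperature ** 2
            opt.zero_grad(); loss.backward(); opt.step()
    return student
```

The design trade-off the snippet hints at: subtraction alone skews the model, so the distillation pass spends extra server computation to avoid any further client participation.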

Federated unlearning: How to efficiently erase a client in FL?

A Halimi, S Kadhe, A Rawat, N Baracaldo - arXiv preprint arXiv …, 2022 - arxiv.org
… unlearning … knowledge distillation process. Both these methods require the server to
keep the history of the parameter updates from all the clients. Moreover, the knowledge-distillation …
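
Both this snippet and the previous entry point at the same storage cost: the server must log every client's per-round update so a contribution can later be subtracted. A small sketch of that bookkeeping (class name and API are hypothetical):

```python
from collections import defaultdict
import torch

class UpdateHistory:
    """Server-side log of per-client, per-round parameter deltas.
    Memory grows as O(rounds x clients x model size), which is the
    overhead both snippets allude to."""
    def __init__(self):
        self._history = defaultdict(list)  # client_id -> [deltas per round]

    def record(self, client_id: str, delta: list[torch.Tensor]) -> None:
        # Store a detached copy so later training cannot mutate the log.
        self._history[client_id].append([d.detach().clone() for d in delta])

    def accumulated(self, client_id: str) -> list[torch.Tensor]:
        # Sum the client's deltas across all rounds, per parameter tensor.
        rounds = self._history[client_id]
        return [torch.stack(ts).sum(0) for ts in zip(*rounds)]
```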

A survey on federated unlearning: Challenges, methods, and future directions

Z Liu, Y Jiang, J Shen, M Peng, KY Lam… - ACM Computing …, 2023 - dl.acm.org
… construction of federated unlearning. In [132], the concept of knowledge editing throughout
… process in the averaged model, knowledge distillation is employed. This technique facilitates …

Federated unlearning with momentum degradation

Y Zhao, P Wang, H Qi, J Huang, Z Wei… - IEEE Internet of Things …, 2023 - ieeexplore.ieee.org
… unlearning process into two steps: knowledge erasure and memory guidance. We first propose
a novel knowledge … the model performance through knowledge distillation. An advantage …
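
A hedged reconstruction of the two steps the snippet names, with every detail assumed rather than taken from the paper: knowledge erasure as a momentum-style pull of the weights toward a reference model that never saw the forgotten data, then memory guidance via distillation on the remaining clients' data.

```python
import copy
import torch
import torch.nn.functional as F

def momentum_degradation_unlearn(global_model, degraded_ref, retain_loader,
                                 beta=0.9, steps=3, temperature=2.0, lr=1e-3):
    """Sketch of a two-step unlearning pass; `degraded_ref` is assumed to
    be a model holding none of the forgotten knowledge (e.g., freshly
    initialized), and `beta` plays the momentum-like role."""
    student = copy.deepcopy(global_model)
    # Step 1: knowledge erasure via momentum-weighted interpolation
    # toward the degraded reference model.
    with torch.no_grad():
        for _ in range(steps):
            for p, r in zip(student.parameters(), degraded_ref.parameters()):
                p.mul_(beta).add_(r, alpha=1.0 - beta)
    # Step 2: memory guidance, distilling the original global model's
    # predictions on retained data to recover performance.
    global_model.eval()
    opt = torch.optim.SGD(student.parameters(), lr=lr)
    for x, _ in retain_loader:
        with torch.no_grad():
            teacher = F.softmax(global_model(x) / temperature, dim=1)
        loss = F.kl_div(F.log_softmax(student(x) / temperature, dim=1),
                        teacher, reduction="batchmean") * temperature ** 2
        opt.zero_grad(); loss.backward(); opt.step()
    return student
```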

Heterogeneous federated knowledge graph embedding learning and unlearning

X Zhu, G Li, W Hu - Proceedings of the ACM web conference 2023, 2023 - dl.acm.org
… mutual knowledge distillation to transfer local knowledge to … federated KG embedding
learning and unlearning framework. In federated learning, we design mutual knowledge distillation …
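
A sketch of what one mutual-distillation step could look like, where the local and global models take turns as teacher and student; the loss form and training loop are assumptions (the paper applies this to KG embedding models, and `batch` is assumed to yield inputs for which both models produce logits over a shared candidate space).

```python
import torch.nn.functional as F

def mutual_kd_step(local_model, global_model, batch,
                   opt_local, opt_global, temperature=2.0):
    """One mutual knowledge distillation step: each model minimizes a KL
    term toward the other's softened, detached predictions."""
    t = temperature
    # Local model learns from the global model's predictions.
    loss_local = F.kl_div(
        F.log_softmax(local_model(batch) / t, dim=1),
        F.softmax(global_model(batch).detach() / t, dim=1),
        reduction="batchmean") * t ** 2
    opt_local.zero_grad(); loss_local.backward(); opt_local.step()
    # Global model learns from the (updated) local model's predictions.
    loss_global = F.kl_div(
        F.log_softmax(global_model(batch) / t, dim=1),
        F.softmax(local_model(batch).detach() / t, dim=1),
        reduction="batchmean") * t ** 2
    opt_global.zero_grad(); loss_global.backward(); opt_global.step()
    return loss_local.item(), loss_global.item()
```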

SoK: Challenges and Opportunities in Federated Unlearning

H Jeong, S Ma, A Houmansadr - arXiv preprint arXiv:2403.02437, 2024 - arxiv.org
… background knowledge relevant to Federated Unlearning (… to Federated Learning (FL)
and Machine Unlearning (MU), … Takeaways on knowledge distillation: Knowledge distillation …
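
Since knowledge distillation is the recurring ingredient across these entries, it may help to recall the standard objective (Hinton et al.) that the federated variants adapt; the notation below is ours:

$$\mathcal{L}_{\mathrm{KD}} \;=\; (1-\alpha)\,\mathcal{L}_{\mathrm{CE}}\bigl(y,\,\sigma(z_s)\bigr) \;+\; \alpha\,T^{2}\,\mathrm{KL}\bigl(\sigma(z_t/T)\,\big\|\,\sigma(z_s/T)\bigr),$$

where $z_s$ and $z_t$ are the student and teacher logits, $T$ is the softening temperature, $\sigma$ is the softmax, and $\alpha$ weights the distillation term against the hard-label loss.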

Federated Knowledge Graph Unlearning via Diffusion Model

B Liu, Y Fang - arXiv preprint arXiv:2403.08554, 2024 - arxiv.org
… Zhu et al. [8] propose a mutual knowledge distillation method to transfer local knowledge
to the global model and absorb global knowledge back. Because of the requirements of privacy …

Fast federated machine unlearning with nonlinear functional theory

T Che, Y Zhou, Z Zhang, L Lyu, J Liu… - International …, 2023 - proceedings.mlr.press
… of-the-art federated machine unlearning models. Knowledge Distillation is a federated unlearning
… from the model and leveraging the knowledge distillation method to restore the model’s …

Federated unlearning for on-device recommendation

W Yuan, H Yin, F Wu, S Zhang, T He… - Proceedings of the …, 2023 - dl.acm.org
… unlearning in federated recommendation systems, we propose an efficient unlearning method
FRU (Federated … To the best of our knowledge, this is the first work to investigate machine …

Communication efficient and provable federated unlearning

Y Tao, CL Wang, M Pan, D Yu, X Cheng… - arXiv preprint arXiv …, 2024 - arxiv.org
… To the best of our knowledge, this is the first formal definition of exact federated … recent
works have proposed approximate unlearning techniques based on knowledge distillation [30], …
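
For context on what "exact" means here: exact federated unlearning is usually defined as distributional equivalence with retraining from scratch. A common form of such a definition (not necessarily this paper's exact statement) is

$$\Pr\bigl[\,U(A(D),\,D_u)\in S\,\bigr] \;=\; \Pr\bigl[\,A(D\setminus D_u)\in S\,\bigr] \quad \text{for all measurable sets } S,$$

where $A$ is the federated training algorithm on the full client data $D$, $D_u$ is the departing client's data, and $U$ is the unlearning mechanism. Distillation-based methods like those cited above are approximate in that they only aim to bring the two distributions close rather than make them equal.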