… unlearning … knowledge distillation process. Both these methods require the server to keep the history of the parameter updates from all the clients. Moreover, the knowledge distillation …
… construction of federated unlearning. In [132], the concept of knowledge editing throughout … process in the averaged model, knowledge distillation is employed. This technique facilitates …
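The snippets above only hint at the mechanism, so the following is a minimal sketch of the history-replay idea they describe: the server stores every client's per-round parameter updates and rebuilds the global model while skipping the contributions of the client to be forgotten. All names here (`unlearn_client`, `update_history`) are hypothetical, and FedAvg-style mean aggregation is an assumption, not something the snippets confirm.

```python
from typing import Dict, List
import torch

def unlearn_client(
    init_params: Dict[str, torch.Tensor],
    # one entry per training round: client_id -> that client's parameter update
    update_history: List[Dict[int, Dict[str, torch.Tensor]]],
    target_client: int,
) -> Dict[str, torch.Tensor]:
    """Rebuild the global model by replaying every stored round of
    client updates, skipping the client that asked to be forgotten."""
    params = {name: p.clone() for name, p in init_params.items()}
    for round_updates in update_history:
        kept = [u for cid, u in round_updates.items() if cid != target_client]
        if not kept:
            continue  # every update this round came from the target client
        for name in params:
            # FedAvg-style mean over the remaining clients' updates (assumed)
            params[name] += torch.stack([u[name] for u in kept]).mean(dim=0)
    return params
```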
Y Zhao, P Wang, H Qi, J Huang, Z Wei… - IEEE Internet of Things …, 2023 - ieeexplore.ieee.org
… unlearning process into two steps: knowledge erasure and memory guidance. We first propose a novel knowledge … the model performance through knowledge distillation. An advantage …
X Zhu, G Li, W Hu - Proceedings of the ACM web conference 2023, 2023 - dl.acm.org
… mutual knowledge distillation to transfer local knowledge to … federated KG embedding learning and unlearning framework. In federated learning, we design mutual knowledge distillation …
B Liu, Y Fang - arXiv preprint arXiv:2403.08554, 2024 - arxiv.org
… , Zhu et al. [8] propose a mutual knowledge distillation method to transfer local knowledge to the global model and absorb global knowledge back into the local models. Because of the requirements of privacy …
… of-the-art federated machine unlearning models. Knowledge distillation is a federated unlearning … from the model and leveraging the knowledge distillation method to restore the model's …
… unlearning in federated recommendation systems, we propose an efficient unlearning method, FRU (Federated … To the best of our knowledge, this is the first work to investigate machine …
… To the best of our knowledge, this is the first formal definition of exact federated … recent works have proposed approximate unlearning techniques based on knowledge distillation [30], …