C Wu, F Wu, L Lyu, Y Huang, X Xie - Nature communications, 2022 - nature.com
… This is because in FedKD there are multiple mentor models on different decentralized clients for personalized learning and knowledge distillation. In addition, FedKD can save up to …
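The snippet describes FedKD's setup: each client keeps a larger local "mentor" and shares a smaller "mentee", and the two learn reciprocally. Below is a minimal PyTorch sketch of such a mutual-distillation loss; the temperature, weighting, and function names are illustrative assumptions, not the paper's code.

    import torch.nn.functional as F

    def kd_term(student_logits, teacher_logits, t=4.0):
        # KL between softened distributions; the t^2 factor keeps gradients scaled.
        return F.kl_div(F.log_softmax(student_logits / t, dim=-1),
                        F.softmax(teacher_logits / t, dim=-1),
                        reduction="batchmean") * (t * t)

    def mutual_distillation_losses(mentor_logits, mentee_logits, labels, t=4.0):
        # Each model fits the hard labels plus the other's softened predictions;
        # detach() stops each side from optimizing its partner's output.
        mentor_loss = (F.cross_entropy(mentor_logits, labels)
                       + kd_term(mentor_logits, mentee_logits.detach(), t))
        mentee_loss = (F.cross_entropy(mentee_logits, labels)
                       + kd_term(mentee_logits, mentor_logits.detach(), t))
        return mentor_loss, mentee_loss

Because only the small mentee is exchanged with the server, communication cost scales with the mentee rather than the mentor, which is consistent with the savings the snippet alludes to.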
Y Chen, W Lu, X Qin, J Wang… - IEEE Transactions on Neural Networks and Learning Systems, 2023 - ieeexplore.ieee.org
… federation can be viewed as an independent individual. To implement MetaFed, we propose a cyclic knowledge distillation … knowledge accumulation stage and the personalization stage…
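The cyclic scheme named in the snippet can be read as a ring in which each federation distills from its predecessor while fitting its own data, first to accumulate knowledge and then to personalize. A minimal sketch under that reading (hyperparameters and the single-pass structure are assumptions):

    import torch
    import torch.nn.functional as F

    def cyclic_distillation_pass(models, loaders, lam=0.5, t=3.0, lr=1e-3):
        # One pass around the ring: federation i fits its own data while
        # distilling from federation (i - 1)'s model, so knowledge accumulates
        # as it travels; a later pass with a smaller lam can personalize.
        n = len(models)
        for i in range(n):
            teacher, student = models[(i - 1) % n], models[i]
            teacher.eval()
            opt = torch.optim.SGD(student.parameters(), lr=lr)
            for x, y in loaders[i]:
                with torch.no_grad():
                    t_logits = teacher(x)
                s_logits = student(x)
                kd = F.kl_div(F.log_softmax(s_logits / t, dim=-1),
                              F.softmax(t_logits / t, dim=-1),
                              reduction="batchmean") * (t * t)
                loss = F.cross_entropy(s_logits, y) + lam * kd
                opt.zero_grad(); loss.backward(); opt.step()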
H Jin, D Bai, D Yao, Y Dai, L Gu, C Yu… - IEEE Transactions on …, 2022 - ieeexplore.ieee.org
… In this paper, we study personalized federated learning in which our goal is to train models … a novel Personalized Federated Learning (PFL) framework via self-knowledge distillation, …
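Self-knowledge distillation here plausibly means each client regularizes its current model toward a frozen snapshot of its own previous personalized model, rather than toward an external teacher. A hedged sketch of one local update under that reading:

    import copy
    import torch
    import torch.nn.functional as F

    def local_update_with_self_distillation(model, prev_model, loader,
                                            mu=0.3, t=3.0, lr=1e-2):
        # The client's own previous personalized model (a frozen snapshot) is
        # the teacher, so aggregation rounds do not wash out personalization.
        prev_model.eval()
        opt = torch.optim.SGD(model.parameters(), lr=lr)
        for x, y in loader:
            with torch.no_grad():
                prev_logits = prev_model(x)
            logits = model(x)
            kd = F.kl_div(F.log_softmax(logits / t, dim=-1),
                          F.softmax(prev_logits / t, dim=-1),
                          reduction="batchmean") * (t * t)
            loss = F.cross_entropy(logits, y) + mu * kd
            opt.zero_grad(); loss.backward(); opt.step()
        return copy.deepcopy(model)  # becomes the next round's self-teacher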
E Jeong, M Kountouris - ICC 2023-IEEE International …, 2023 - ieeexplore.ieee.org
… However, it is generally challenging to quantify similarity under limited knowledge about … propose a personalized and fully decentralized FL algorithm, leveraging knowledge distillation …
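The snippet flags that quantifying client similarity is hard without access to raw data. One plausible heuristic, an assumption for illustration and not the paper's estimator, is to weight neighbors' soft labels by how close their output distributions are to the local one:

    import torch
    import torch.nn.functional as F

    def similarity_weighted_target(local_logits, neighbor_logits, t=2.0):
        # Build a distillation target from neighbors' soft predictions,
        # weighting each neighbor by the (negative) symmetric KL divergence
        # to the local output distribution, a crude proxy for data similarity.
        log_local = F.log_softmax(local_logits / t, dim=-1)
        p_local = log_local.exp()
        scores, targets = [], []
        for nl in neighbor_logits:
            log_p = F.log_softmax(nl / t, dim=-1)
            p = log_p.exp()
            sym_kl = 0.5 * ((p_local * (log_local - log_p)).sum(-1)
                            + (p * (log_p - log_local)).sum(-1)).mean()
            scores.append(-sym_kl)   # higher score = more similar neighbor
            targets.append(p)
        w = torch.softmax(torch.stack(scores), dim=0)
        return sum(wi * pi for wi, pi in zip(w, targets))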
… quantized and personalized models potentially having different dimensions/… of personalized federated learning and learning quantized models; we also employ knowledge distillation …
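Distillation couples models only through their output logits, which is why it suits quantized and personalized models with mismatched dimensions. A minimal sketch (names and hyperparameters assumed) of a full-precision personalized teacher training a quantized student:

    import torch
    import torch.nn.functional as F

    def distill_into_quantized(teacher, student, loader, t=2.0, lr=1e-3):
        # The full-precision teacher and the quantized (smaller) student may
        # have entirely different internal architectures and precisions.
        teacher.eval()
        opt = torch.optim.SGD(student.parameters(), lr=lr)
        for x, y in loader:
            with torch.no_grad():
                t_logits = teacher(x)
            s_logits = student(x)
            kd = F.kl_div(F.log_softmax(s_logits / t, dim=-1),
                          F.softmax(t_logits / t, dim=-1),
                          reduction="batchmean") * (t * t)
            loss = F.cross_entropy(s_logits, y) + kd
            opt.zero_grad(); loss.backward(); opt.step()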
Z Chen, H Yang, T Quek… - Advances in Neural Information Processing Systems, 2023 - proceedings.neurips.cc
… and personalized models, we can distill the knowledge of the … benefit the training of the personalized models. By combining … knowledge distillation to bridge the training of generic and …
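A plausible form of the coupled objectives, with f_g the generic model, f_p the personalized model, \sigma the softmax, T a temperature, and \lambda a weight (all notation assumed, not taken from the paper):

    \mathcal{L}_p = \mathrm{CE}\big(f_p(x), y\big) + \lambda T^2\,\mathrm{KL}\big(\sigma(f_g(x)/T)\,\|\,\sigma(f_p(x)/T)\big),
    \mathcal{L}_g = \mathrm{CE}\big(f_g(x), y\big) + \lambda T^2\,\mathrm{KL}\big(\sigma(f_p(x)/T)\,\|\,\sigma(f_g(x)/T)\big).

Each model thus fits the labels while staying close to the other's softened predictions, which is one way to "bridge" the two trainings as the snippet describes.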
… personalized federated learning using knowledge distillation. Knowledge distillation requires the migration of knowledge … However, in federated learning, the server cannot access the data …
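A common workaround for the server's lack of data, in the style of FedDF-type ensemble distillation rather than any single paper above, is to distill the ensemble of uploaded client models on an unlabeled public proxy dataset, since only soft predictions are needed:

    import torch
    import torch.nn.functional as F

    def server_ensemble_distillation(global_model, client_models, proxy_loader,
                                     steps=100, lr=1e-3, t=2.0):
        # The server never touches client data or labels: the averaged client
        # predictions on public proxy inputs serve as the distillation target.
        for m in client_models:
            m.eval()
        opt = torch.optim.Adam(global_model.parameters(), lr=lr)
        done = 0
        for x in proxy_loader:          # proxy batches carry inputs only
            if done == steps:
                break
            with torch.no_grad():
                teacher = torch.stack([m(x) for m in client_models]).mean(0)
            s_logits = global_model(x)
            loss = F.kl_div(F.log_softmax(s_logits / t, dim=-1),
                            F.softmax(teacher / t, dim=-1),
                            reduction="batchmean") * (t * t)
            opt.zero_grad(); loss.backward(); opt.step()
            done += 1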
Z Zhu, J Hong, J Zhou - International Conference on Machine Learning, 2021 - proceedings.mlr.press
… model using aggregated knowledge from heterogeneous users, … knowledge is not fully utilized to guide local model learning, … a data-free knowledge distillation approach to address …
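Zhu et al.'s data-free approach replaces proxy data with a server-side conditional generator trained against the ensemble of client models. The sketch below captures that core loop; the published method adds further regularizers, and all dimensions and names here are assumptions:

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class CondGenerator(nn.Module):
        # Maps (noise, class label) to a synthetic feature vector; it lives on
        # the server and never sees real data.
        def __init__(self, n_classes, noise_dim=32, feat_dim=128):
            super().__init__()
            self.emb = nn.Embedding(n_classes, noise_dim)
            self.net = nn.Sequential(nn.Linear(2 * noise_dim, 256), nn.ReLU(),
                                     nn.Linear(256, feat_dim))
        def forward(self, z, y):
            return self.net(torch.cat([z, self.emb(y)], dim=-1))

    def generator_step(gen, client_heads, n_classes, opt, batch=64, noise_dim=32):
        # The ensemble of uploaded client classifier heads acts as a data-free
        # teacher: the generator learns features the ensemble assigns to class y.
        z = torch.randn(batch, noise_dim)
        y = torch.randint(0, n_classes, (batch,))
        logits = torch.stack([head(gen(z, y)) for head in client_heads]).mean(0)
        loss = F.cross_entropy(logits, y)
        opt.zero_grad(); loss.backward(); opt.step()
        return loss.item()

Clients can then sample the broadcast generator during local training, so the aggregated knowledge does guide local learning, which is the gap the snippet identifies.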
S Ahmad, A Aral - … IEEE/ACM 15th International Conference on …, 2022 - ieeexplore.ieee.org
… Various challenges involved in this work include: 1) enabling knowledge distillation between a number of clients without requiring a large pre-trained teacher network; 2) reducing the …
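Challenge 1) is typically met by letting the clients teach each other: the averaged softened predictions of the peer models stand in for the missing pre-trained teacher, as in online or mutual distillation. A minimal sketch of that target construction (an assumption, not the paper's exact formulation):

    import torch.nn.functional as F

    def peer_ensemble_target(peer_logits, t=2.0):
        # With no large pre-trained teacher available, the averaged softened
        # predictions of the other (equally small) clients serve as the
        # teacher signal for each client's distillation loss.
        probs = [F.softmax(p / t, dim=-1) for p in peer_logits]
        return sum(probs) / len(probs)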