Parameterized knowledge transfer for personalized federated learning

J Zhang, S Guo, X Ma, H Wang… - Advances in Neural Information Processing Systems, 2021 - proceedings.neurips.cc
… Heterogeneous FL and Knowledge Distillation. To enable heterogeneous model architectures
… Another way of personalization is to use Knowledge Distillation (KD) in heterogeneous FL …
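
The snippet names the core mechanism the papers below all build on: clients exchange predictions instead of weights, so their architectures may differ. As a reference point, here is a minimal sketch of the standard temperature-scaled distillation loss in PyTorch; the function name and temperature value are illustrative, not taken from this paper.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, T=2.0):
    """Classic temperature-scaled KL distillation loss."""
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    p_teacher = F.softmax(teacher_logits / T, dim=1)
    # The T^2 factor keeps gradients comparable to the hard-label loss.
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * T * T

# Heterogeneous models only need a matching output space:
student_logits = torch.randn(8, 10)  # batch of 8, 10 classes
teacher_logits = torch.randn(8, 10)
loss = kd_loss(student_logits, teacher_logits)
```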

Communication-efficient federated learning via knowledge distillation

C Wu, F Wu, L Lyu, Y Huang, X Xie - Nature Communications, 2022 - nature.com
… This is because in FedKD there are multiple mentor models on different decentralized
clients for personalized learning and knowledge distillation. In addition, FedKD can save up to …
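
In FedKD each client pairs a large private mentor with a small shared mentee, and the two distill from each other during local training; only the mentee's (compressed) updates are communicated. A rough sketch of one mutual step, assuming standard PyTorch models and optimizers; the names and loss weighting are mine, and the paper's SVD-based compression of the mentee's updates is omitted.

```python
import torch
import torch.nn.functional as F

def kl_at_T(student_logits, teacher_logits, T=2.0):
    # The teacher side is detached: each model trains only on its own loss.
    return F.kl_div(F.log_softmax(student_logits / T, dim=1),
                    F.softmax(teacher_logits.detach() / T, dim=1),
                    reduction="batchmean") * T * T

def mutual_step(mentor, mentee, opt_mentor, opt_mentee, x, y):
    lm, ls = mentor(x), mentee(x)
    loss_mentor = F.cross_entropy(lm, y) + kl_at_T(lm, ls)  # mentee teaches mentor
    loss_mentee = F.cross_entropy(ls, y) + kl_at_T(ls, lm)  # mentor teaches mentee
    opt_mentor.zero_grad(); opt_mentee.zero_grad()
    (loss_mentor + loss_mentee).backward()  # the two graphs are disjoint
    opt_mentor.step(); opt_mentee.step()
```

Only the mentee's weights would ever leave the device; the private mentor stays local, which is where both the personalization and the communication savings come from.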

MetaFed: Federated learning among federations with cyclic knowledge distillation for personalized healthcare

Y Chen, W Lu, X Qin, J Wang… - IEEE Transactions on Neural Networks and Learning Systems, 2023 - ieeexplore.ieee.org
federation can be viewed as an independent individual. To implement MetaFed, we propose
a cyclic knowledge distillation method with a common knowledge accumulation stage and the personalization stage…
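
MetaFed replaces the central server with a ring of federations, each distilling from its predecessor. A toy sketch of one cyclic pass, assuming each federation is reduced to a single model and data loader; lam, T, and lr are invented, and the paper's two distinct stages (common knowledge accumulation, then personalization) are collapsed into one loop.

```python
import copy
import torch
import torch.nn.functional as F

def cyclic_round(federations, loaders, lam=0.5, T=2.0, lr=0.01):
    """One pass around the ring: federation i learns from federation i-1."""
    for i, model in enumerate(federations):
        teacher = copy.deepcopy(federations[i - 1]).eval()  # ring wraps at i == 0
        opt = torch.optim.SGD(model.parameters(), lr=lr)
        for x, y in loaders[i]:
            s = model(x)
            with torch.no_grad():
                t = teacher(x)
            kd = F.kl_div(F.log_softmax(s / T, dim=1),
                          F.softmax(t / T, dim=1),
                          reduction="batchmean") * T * T
            loss = F.cross_entropy(s, y) + lam * kd
            opt.zero_grad(); loss.backward(); opt.step()
```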

Personalized edge intelligence via federated self-knowledge distillation

H Jin, D Bai, D Yao, Y Dai, L Gu, C Yu… - IEEE Transactions on …, 2022 - ieeexplore.ieee.org
… In this paper, we study personalized federated learning in which our goal is to train models
… a novel Personalized Federated Learning (PFL) framework via self-knowledge distillation, …
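
In self-knowledge distillation the teacher is not a peer or a server model but a frozen snapshot of the client's own network, so personalization needs no extra communication. A minimal sketch under that reading; the snapshot choice, lam, and T are illustrative rather than the paper's settings.

```python
import copy
import torch
import torch.nn.functional as F

def local_epoch_with_self_kd(model, loader, opt, lam=0.3, T=2.0):
    """The model's own pre-update snapshot regularizes its new update."""
    snapshot = copy.deepcopy(model).eval()
    for x, y in loader:
        logits = model(x)
        with torch.no_grad():
            past = snapshot(x)
        kd = F.kl_div(F.log_softmax(logits / T, dim=1),
                      F.softmax(past / T, dim=1),
                      reduction="batchmean") * T * T
        loss = F.cross_entropy(logits, y) + lam * kd
        opt.zero_grad(); loss.backward(); opt.step()
```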

Personalized decentralized federated learning with knowledge distillation

E Jeong, M Kountouris - ICC 2023 - IEEE International Conference on Communications, 2023 - ieeexplore.ieee.org
… However, it is generally challenging to quantify similarity under limited knowledge about …
propose a personalized and fully decentralized FL algorithm, leveraging knowledge distillation
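
The difficulty quoted above is estimating inter-client similarity without seeing anyone's data. One way to make that concrete, purely as an illustration and not the authors' exact estimator: compare soft predictions on a shared probe batch and turn the divergences into weights over neighbors' soft labels.

```python
import torch
import torch.nn.functional as F

def similarity_weights(my_logits, neighbor_logits_list, T=2.0):
    """Map prediction divergence on a common probe batch to aggregation weights."""
    p_me = F.softmax(my_logits / T, dim=1)
    divs = []
    for nb_logits in neighbor_logits_list:
        p_nb = F.softmax(nb_logits / T, dim=1)
        divs.append(F.kl_div(p_nb.log(), p_me, reduction="batchmean"))
    # Smaller divergence -> larger weight on that neighbor as a teacher.
    return F.softmax(-torch.stack(divs), dim=0)
```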

QuPeD: Quantized personalization via distillation with applications to federated learning

K Ozkara, N Singh, D Data… - Advances in Neural Information Processing Systems, 2021 - proceedings.neurips.cc
… quantized and personalized models potentially having different dimensions/… of personalized
federated learning and learning quantized models; we also employ knowledge distillation
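
QuPeD couples two ingredients: a personalized model constrained to quantized weights, and distillation from a full-precision model, which is why the two can differ in dimensions. A sketch of both pieces with a straight-through uniform quantizer; the bit-width, lam, and function names are assumptions, and the paper's alternating optimization is not shown.

```python
import torch
import torch.nn.functional as F

def quantize_ste(w, num_bits=2):
    """Uniform quantizer with a straight-through gradient estimate."""
    qmax = 2 ** (num_bits - 1) - 1
    scale = w.abs().max() / qmax + 1e-12
    w_q = torch.round(w / scale).clamp(-qmax - 1, qmax) * scale
    return w + (w_q - w).detach()  # forward: quantized; backward: identity

def quped_style_loss(student_logits, teacher_logits, y, lam=0.5, T=2.0):
    """Hard-label loss on the quantized model plus KD from full precision."""
    kd = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                  F.softmax(teacher_logits.detach() / T, dim=1),
                  reduction="batchmean") * T * T
    return F.cross_entropy(student_logits, y) + lam * kd
```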

Spectral co-distillation for personalized federated learning

Z Chen, H Yang, T Quek… - Advances in Neural Information Processing Systems, 2023 - proceedings.neurips.cc
… and personalized models, we can distill the knowledge of the … benefit the training of the
personalized models. By combining … knowledge distillation to bridge the training of generic and …
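
Co-distillation here is bidirectional: the generic (global) model and each personalized model regularize one another during training. A minimal two-way sketch; the spectral weighting that gives the paper its title is omitted, and the loss composition is illustrative.

```python
import torch
import torch.nn.functional as F

def co_distill_losses(generic_logits, personal_logits, y, T=2.0):
    """Generic and personalized models each get a hard-label loss plus a KD term."""
    def kd(s, t):
        return F.kl_div(F.log_softmax(s / T, dim=1),
                        F.softmax(t.detach() / T, dim=1),
                        reduction="batchmean") * T * T
    loss_generic = F.cross_entropy(generic_logits, y) + kd(generic_logits, personal_logits)
    loss_personal = F.cross_entropy(personal_logits, y) + kd(personal_logits, generic_logits)
    return loss_generic, loss_personal
```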

A Personalized Federated Learning Method Based on Clustering and Knowledge Distillation

J Zhang, Y Shi - Electronics, 2024 - mdpi.com
personalized federated learning using knowledge distillation. Knowledge distillation requires
the migration of knowledge … However, in federated learning, the server cannot access the data …
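
Because the server cannot access raw data, clustering has to operate on what it does see, such as the clients' model updates. A naive k-means sketch over flattened updates, after which aggregation and distillation would proceed per cluster; this pipeline is an assumption for illustration, not the paper's exact procedure.

```python
import torch

def cluster_clients(updates, num_clusters=3, iters=10):
    """Naive k-means over L2-normalized, flattened client updates."""
    X = torch.stack([u / (u.norm() + 1e-12) for u in updates])
    centers = X[torch.randperm(len(X))[:num_clusters]].clone()
    for _ in range(iters):
        assign = torch.cdist(X, centers).argmin(dim=1)
        for k in range(num_clusters):
            if (assign == k).any():
                centers[k] = X[assign == k].mean(dim=0)
    return assign  # per-cluster aggregation/distillation happens downstream
```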

Data-free knowledge distillation for heterogeneous federated learning

Z Zhu, J Hong, J Zhou - International Conference on Machine Learning, 2021 - proceedings.mlr.press
… model using aggregated knowledge from heterogeneous users, … knowledge is not fully
utilized to guide local model learning, … a data-free knowledge distillation approach to address …
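
Data-free means the server distills the clients' aggregated knowledge without holding any real samples: it trains a generator whose outputs the client ensemble classifies confidently, then uses those synthetic samples as the distillation medium. A toy sketch in that spirit; the dimensions, architecture, and single loss term are invented for illustration (the published method, FedGen, works in a latent feature space with additional regularizers).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Generator(nn.Module):
    """Maps noise plus a class label to a pseudo input for distillation."""
    def __init__(self, z_dim=16, num_classes=10, out_dim=32):
        super().__init__()
        self.num_classes = num_classes
        self.net = nn.Sequential(nn.Linear(z_dim + num_classes, 64),
                                 nn.ReLU(),
                                 nn.Linear(64, out_dim))

    def forward(self, z, y):
        y_onehot = F.one_hot(y, self.num_classes).float()
        return self.net(torch.cat([z, y_onehot], dim=1))

def generator_step(gen, client_models, opt, batch=64, z_dim=16, num_classes=10):
    z = torch.randn(batch, z_dim)
    y = torch.randint(0, num_classes, (batch,))
    x_fake = gen(z, y)
    # The uploaded client models act as a frozen ensemble teacher.
    logits = torch.stack([m(x_fake) for m in client_models]).mean(dim=0)
    loss = F.cross_entropy(logits, y)  # make the ensemble confident on (x_fake, y)
    opt.zero_grad(); loss.backward(); opt.step()  # opt updates only gen's parameters
```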

FedCD: Personalized federated learning via collaborative distillation

S Ahmad, A Aral - IEEE/ACM 15th International Conference on Utility and Cloud Computing (UCC), 2022 - ieeexplore.ieee.org
… Various challenges involved in this work include: 1) enabling knowledge distillation between
a number of clients without requiring a large pre-trained teacher network. 2) reducing the …
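
The first challenge listed, distillation without a large pre-trained teacher, is typically met by letting the cohort itself play the teacher. A hedged sketch in which each client distills from its peers' averaged soft labels; whether FedCD aggregates exactly this way cannot be read off the snippet.

```python
import torch
import torch.nn.functional as F

def peer_ensemble_kd_loss(own_logits, peer_logits_list, y, lam=0.5, T=2.0):
    """Average the peers' softened predictions and use them as the teacher."""
    with torch.no_grad():
        teacher = torch.stack([F.softmax(p / T, dim=1)
                               for p in peer_logits_list]).mean(dim=0)
    kd = F.kl_div(F.log_softmax(own_logits / T, dim=1), teacher,
                  reduction="batchmean") * T * T
    return F.cross_entropy(own_logits, y) + lam * kd
```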