Class-wise adaptive self distillation for heterogeneous federated learning

Y He, Y Chen, X Yang, Y Zhang… - Proceedings of the 36th …, 2022 - federated-learning.org
The heterogeneity of data distributions among clients (non-IID) has been identified as one of
the key challenges in federated learning. In the local training phase, each client model …

Local-global knowledge distillation in heterogeneous federated learning with non-iid data

D Yao, W Pan, Y Dai, Y Wan, X Ding, H Jin… - arXiv preprint arXiv …, 2021 - arxiv.org
Federated learning enables multiple clients to collaboratively learn a global model by
periodically aggregating the clients' models without transferring the local data. However, due …

Learning critically: Selective self-distillation in federated learning on non-iid data

Y He, Y Chen, XD Yang, H Yu… - IEEE Transactions on …, 2022 - ieeexplore.ieee.org
Federated learning (FL) enables multiple clients to collaboratively train a global model while
keeping local data decentralized. Data heterogeneity (non-IID) across clients has imposed …

Global knowledge distillation in federated learning

W Pan, L Sun - arXiv preprint arXiv:2107.00051, 2021 - ask.qcloudimg.com
Knowledge distillation has caught a lot of attention in Federated Learning (FL)
recently. It has the advantage for FL to train on heterogeneous clients which have different …

FedGKD: Towards Heterogeneous Federated Learning via Global Knowledge Distillation

D Yao, W Pan, Y Dai, Y Wan, X Ding… - IEEE Transactions …, 2023 - ieeexplore.ieee.org
Federated learning, as one enabling technology of edge intelligence, has gained substantial
attention due to its efficacy in training deep learning models without data privacy and …

FedTweet: Two-fold Knowledge Distillation for non-IID Federated Learning

Y Wang, W Wang, X Wang, H Zhang, X Wu… - Computers and Electrical …, 2024 - Elsevier
Federated Learning (FL) is a distributed learning approach that allows each client to retain
its original data locally and share only the parameters of the local updates with the server …

Meta knowledge condensation for federated learning

P Liu, X Yu, JT Zhou - arXiv preprint arXiv:2209.14851, 2022 - arxiv.org
Existing federated learning paradigms usually extensively exchange distributed models at a
central solver to achieve a more powerful model. However, this would incur severe …

FedRAD: Heterogeneous Federated Learning via Relational Adaptive Distillation

J Tang, X Ding, D Hu, B Guo, Y Shen, P Ma, Y Jiang - Sensors, 2023 - mdpi.com
As the development of the Internet of Things (IoT) continues, Federated Learning (FL) is
gaining popularity as a distributed machine learning framework that does not compromise …

Data-free knowledge distillation for heterogeneous federated learning

Z Zhu, J Hong, J Zhou - International conference on machine …, 2021 - proceedings.mlr.press
Federated Learning (FL) is a decentralized machine-learning paradigm, in which a global
server iteratively averages the model parameters of local users without accessing their data …

Global prototype distillation for heterogeneous federated learning

S Wu, J Chen, X Nie, Y Wang, X Zhou, L Lu, W Peng… - Scientific Reports, 2024 - nature.com
Federated learning is a distributed machine learning paradigm where the goal is to
collaboratively train a high quality global model while private training data remains local …