Fine-tuning global model via data-free knowledge distillation for non-iid federated learning

L Zhang, L Shen, L Ding, D Tao… - Proceedings of the …, 2022 - openaccess.thecvf.com
… To address this issue, we propose a data-free knowledge distillation method to fine-tune
the global model, so that the global model can preserve the knowledge in local models and …
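
The snippet states the goal (fine-tune the global model so it preserves the local models' knowledge, without touching client data) but not the mechanics. Below is a minimal sketch of that general recipe; all module shapes and names are illustrative, and the frozen generator is a stand-in (in the paper the generator is itself trained against the local-model ensemble, a loop omitted here):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

latent_dim, feat_dim, num_classes, num_clients = 32, 64, 10, 5

# Illustrative stand-ins: a generator producing pseudo-inputs, the server's
# global model, and the clients' uploaded local models.
generator = nn.Sequential(nn.Linear(latent_dim, 128), nn.ReLU(), nn.Linear(128, feat_dim))
global_model = nn.Linear(feat_dim, num_classes)
local_models = [nn.Linear(feat_dim, num_classes) for _ in range(num_clients)]

opt = torch.optim.SGD(global_model.parameters(), lr=0.01)
for step in range(100):
    z = torch.randn(64, latent_dim)
    x_fake = generator(z).detach()            # pseudo-data: no client data is needed
    with torch.no_grad():                     # the ensemble of local models is the teacher
        teacher = torch.stack([m(x_fake) for m in local_models]).mean(dim=0)
    loss = F.kl_div(F.log_softmax(global_model(x_fake), dim=-1),
                    F.softmax(teacher, dim=-1), reduction="batchmean")
    opt.zero_grad()
    loss.backward()
    opt.step()
```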

Local-global knowledge distillation in heterogeneous federated learning with non-iid data

D Yao, W Pan, Y Dai, Y Wan, X Ding, H Jin… - arXiv preprint arXiv …, 2021 - arxiv.org
… • We introduce an ensemble-based knowledge distillation technique to … on non-IID data distribution.
• We provide a generalized and simple method for federated knowledge distillation, …
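
The snippet names ensemble-based distillation without the details; a minimal sketch of the usual recipe, in which the temperature-softened predictions of all client models are averaged into a single teacher distribution, might look as follows (function names and the temperature value are illustrative, not the paper's code):

```python
import torch
import torch.nn.functional as F

def ensemble_soft_labels(client_logits: list[torch.Tensor], T: float = 3.0) -> torch.Tensor:
    # Average the temperature-softened predictions of all client models
    # on the same batch; the average acts as the ensemble teacher.
    probs = [F.softmax(logits / T, dim=-1) for logits in client_logits]
    return torch.stack(probs).mean(dim=0)

def distillation_loss(student_logits: torch.Tensor, teacher_probs: torch.Tensor,
                      T: float = 3.0) -> torch.Tensor:
    # KL divergence from the ensemble teacher to the student, scaled by T^2
    # as is conventional so gradients are comparable across temperatures.
    return F.kl_div(F.log_softmax(student_logits / T, dim=-1),
                    teacher_probs, reduction="batchmean") * T * T
```

Because only predictions are aggregated, client architectures need not match, which is what makes this style of distillation attractive for heterogeneous federated learning.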

Learning critically: Selective self-distillation in federated learning on non-iid data

Y He, Y Chen, XD Yang, H Yu… - … on Big Data, 2022 - ieeexplore.ieee.org
… In the following, we start with the formulation of heterogeneous federated learning and
knowledge distillation. Then, we describe our critical learning strategies and the complete …

Communication-efficient federated learning on non-IID data using two-step knowledge distillation

H Wen, Y Wu, J Hu, Z Wang, H Duan… - IEEE Internet of Things …, 2023 - ieeexplore.ieee.org
non-IID data distribution across devices by building a local IID dataset on each device through
data … framework based on Two-step Knowledge Distillation, Fed2KD, has been proposed …
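
The snippet is cut off right where it explains how the local IID dataset is built ("through data …"), so the construction below is only the simplest possible variant (label-balancing by oversampling) and is not necessarily Fed2KD's actual augmentation:

```python
import numpy as np

def balance_local_dataset(x: np.ndarray, y: np.ndarray,
                          rng: np.random.Generator = np.random.default_rng(0)):
    # Oversample under-represented classes until every class appears as often
    # as the most frequent one, approximating a uniform (IID-like) label
    # distribution on the device. Hypothetical sketch, not Fed2KD's code.
    classes, counts = np.unique(y, return_counts=True)
    target = counts.max()
    idx = np.concatenate([rng.choice(np.where(y == c)[0], size=target, replace=True)
                          for c in classes])
    rng.shuffle(idx)
    return x[idx], y[idx]
```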

Communication-efficient federated data augmentation on non-iid data

H Wen, Y Wu, J Li, H Duan - Proceedings of the IEEE/CVF …, 2022 - openaccess.thecvf.com
… of non-IID data, they compromise the privacy of the raw data. … data augmentation strategy,
FedDA, to address the non-IID … data reconstruction is regularized by knowledge distillation …

Knowledge-aware federated active learning with non-iid data

YT Cao, Y Shi, B Yu, J Wang… - Proceedings of the IEEE …, 2023 - openaccess.thecvf.com
knowledge in local clients, ensuring the sampled data is … by limited data and non-IID data
distributions by compensating for … classes using knowledge distillation from the global model. …
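
The snippet only gestures at the mechanism ("compensating for … classes using knowledge distillation from the global model"); one plausible reading, sketched below with entirely hypothetical names and weighting, is to upweight the distillation signal on classes that are rare in the local data:

```python
import torch
import torch.nn.functional as F

def compensated_kd_loss(student_logits: torch.Tensor, global_logits: torch.Tensor,
                        local_class_freq: torch.Tensor, T: float = 2.0) -> torch.Tensor:
    # Weight each class's distillation term inversely to its local frequency,
    # so the global model's knowledge about locally rare classes dominates.
    # This is a hypothetical reading of the snippet, not the paper's loss.
    w = 1.0 / (local_class_freq + 1e-8)
    w = w / w.sum() * w.numel()               # normalise weights to mean 1
    log_p = F.log_softmax(student_logits / T, dim=-1)
    q = F.softmax(global_logits / T, dim=-1)
    kl = (q * (q.clamp_min(1e-8).log() - log_p) * w).sum(dim=-1).mean()
    return kl * T * T
```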

Preservation of the global knowledge by not-true distillation in federated learning

G Lee, M Jeong, Y Shin, S Bae… - Advances in Neural …, 2022 - proceedings.neurips.cc
Knowledge Distillation. Given a teacher model T and a student model S, knowledge
distillation … To understand how non-IID data affects federated learning, we performed an …
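
The definition is truncated; for reference, the standard objective it presumably introduces is Hinton-style distillation, which the paper's "not-true" variant then (as the title suggests) restricts to the non-ground-truth classes:

```latex
\mathcal{L}_{\mathrm{KD}}
  \;=\;
  \tau^{2}\,
  \mathrm{KL}\!\left(
    \sigma\!\left(\frac{z_{T}}{\tau}\right)
    \,\middle\|\,
    \sigma\!\left(\frac{z_{S}}{\tau}\right)
  \right),
```

where z_T and z_S are the logits of the teacher T and the student S, σ is the softmax, and τ is the distillation temperature.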

FedTweet: Two-fold Knowledge Distillation for non-IID Federated Learning

Y Wang, W Wang, X Wang, H Zhang, X Wu… - Computers and Electrical …, 2024 - Elsevier
… that incorporate knowledge distillation into FL, we … data to facilitate the two-fold knowledge
distillation between local and global models to address the challenges posed by non-IID data

Communication-efficient on-device machine learning: Federated distillation and augmentation under non-iid private data

E Jeong, S Oh, H Kim, J Park, M Bennis… - arXiv preprint arXiv …, 2018 - arxiv.org
non-IID private data. For communication efficiency, we propose federated distillation (FD), a
distributed online knowledge distillation … Prior to operating FD, we rectify the non-IID training …
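
Federated distillation as described here exchanges per-label averaged model outputs instead of model weights; a minimal sketch of the two aggregation steps (names illustrative):

```python
import torch

def local_mean_logits(logits: torch.Tensor, labels: torch.Tensor,
                      num_classes: int) -> torch.Tensor:
    # Per-label average logit vector computed on one device's local data;
    # in FD these small tensors are uploaded instead of model weights.
    out = torch.zeros(num_classes, logits.shape[-1])
    for c in range(num_classes):
        mask = labels == c
        if mask.any():
            out[c] = logits[mask].mean(dim=0)
    return out

def global_distillation_targets(client_means: list[torch.Tensor]) -> torch.Tensor:
    # Server side: average the per-label logits across clients; each client
    # then regularises its local training toward these targets.
    return torch.stack(client_means).mean(dim=0)
```

The per-round payload is then on the order of num_classes × num_classes floats rather than a full model's worth of weights, which is the source of the communication savings.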

Distillation-based semi-supervised federated learning for communication-efficient collaborative training with non-iid private data

S Itahara, T Nishio, Y Koda, M Morikura… - IEEE Transactions …, 2021 - ieeexplore.ieee.org
… essential to the success of knowledge distillation, as supported in [26]. Hence, we can
conclude that the intensity of the non-IID data distributions directly affects model performance. …