Knowledge selection and local updating optimization for federated knowledge distillation with heterogeneous models

D Wang, N Zhang, M Tao… - IEEE Journal of Selected …, 2022 - ieeexplore.ieee.org
Federated learning (FL) is a promising distributed learning paradigm in which multiple edge
devices (EDs) collaborate to train a shared model without exchanging privacy-sensitive raw …

Digital twin-assisted knowledge distillation framework for heterogeneous federated learning

X Wang, N Cheng, L Ma, R Sun, R Chai… - China …, 2023 - ieeexplore.ieee.org
In this paper, to deal with the heterogeneity in federated learning (FL) systems, a knowledge
distillation (KD) driven training framework for FL is proposed, where each user can select its …
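Several of the entries below build on the same soft-label distillation objective. For orientation, a minimal sketch of that standard KD loss in PyTorch follows; the temperature T and mixing weight alpha are illustrative defaults, not values taken from the paper above.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Standard knowledge-distillation objective: hard-label cross-entropy
    mixed with KL divergence between temperature-softened teacher and
    student output distributions."""
    ce = F.cross_entropy(student_logits, labels)
    kl = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # T^2 rescaling keeps gradient magnitudes comparable across temperatures
    return alpha * kl + (1 - alpha) * ce
```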

UNIDEAL: Curriculum Knowledge Distillation Federated Learning

Y Yang, C Liu, X Cai, S Huang, H Lu… - ICASSP 2024-2024 …, 2024 - ieeexplore.ieee.org
Federated Learning (FL) has emerged as a promising approach to enable collaborative
learning among multiple clients while preserving data privacy. However, cross-domain FL …

Knowledge Distillation in Federated Learning: a Survey on Long Lasting Challenges and New Solutions

L Qin, T Zhu, W Zhou, PS Yu - arXiv preprint arXiv:2406.10861, 2024 - arxiv.org
Federated Learning (FL) is a distributed and privacy-preserving machine learning paradigm
that coordinates multiple clients to train a model while keeping the raw data localized …

FedSDD: Scalable and Diversity-enhanced Distillation for Model Aggregation in Federated Learning

HM Kwan, S Song - arXiv preprint arXiv:2312.17029, 2023 - arxiv.org
Recently, innovative model aggregation methods based on knowledge distillation (KD) have
been proposed for federated learning (FL). These methods not only improved the …

Fedhe: Heterogeneous models and communication-efficient federated learning

YH Chan, ECH Ngai - 2021 17th International Conference on …, 2021 - ieeexplore.ieee.org
Federated learning (FL) is able to manage edge devices so that they cooperatively train a model while keeping the training data local and private. One common assumption in FL is that all …

Efficient federated learning for AIoT applications using knowledge distillation

T Liu, J Xia, Z Ling, X Fu, S Yu… - IEEE Internet of Things …, 2022 - ieeexplore.ieee.org
As a promising distributed machine learning paradigm, federated learning (FL) trains a
central model with decentralized data without compromising user privacy, which makes it …

Collaborative Learning With Heterogeneous Local Models: A Rule-Based Knowledge Fusion Approach

Y Pang, H Zhang, JD Deng, L Peng… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Federated Learning (FL) has emerged as a promising collaborative learning paradigm that enables training machine learning models across decentralized devices, while keeping the …

Exploring the distributed knowledge congruence in proxy-data-free federated distillation

Z Wu, S Sun, Y Wang, M Liu, Q Pan, J Zhang… - ACM Transactions on …, 2024 - dl.acm.org
Federated learning (FL) is a privacy-preserving machine learning paradigm in which the
server periodically aggregates local model parameters from clients without assembling their …
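The server-side aggregation this snippet refers to is, in its simplest form, FedAvg-style weighted averaging of client parameters. A minimal sketch of that baseline is given below (a generic illustration, not this paper's method; the helper name and weighting by local sample counts are assumptions), assuming all clients share one model architecture:

```python
import copy
import torch

def federated_average(client_states, client_weights):
    """FedAvg-style aggregation: weighted average of client state_dicts.

    client_states: list of state_dicts with identical keys and shapes.
    client_weights: per-client weights, e.g. local dataset sizes.
    """
    total = float(sum(client_weights))
    global_state = copy.deepcopy(client_states[0])
    for key in global_state:
        global_state[key] = sum(
            (w / total) * state[key].float()
            for state, w in zip(client_states, client_weights)
        )
    return global_state
```

Plain averaging of this kind requires identical client architectures, which is the constraint the distillation-based aggregation methods in the entries above aim to relax.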

A Personalized Federated Learning Method Based on Clustering and Knowledge Distillation

J Zhang, Y Shi - Electronics, 2024 - mdpi.com
Federated learning (FL) is a privacy-preserving distributed machine learning paradigm. However, data heterogeneity among clients leads to the shared global model …