Digital twin-assisted knowledge distillation framework for heterogeneous federated learning

X Wang, N Cheng, L Ma, R Sun, R Chai… - China …, 2023 - ieeexplore.ieee.org
In this paper, to deal with the heterogeneity in federated learning (FL) systems, a knowledge
distillation (KD)-driven training framework for FL is proposed, where each user can select its …

Knowledge selection and local updating optimization for federated knowledge distillation with heterogeneous models

D Wang, N Zhang, M Tao… - IEEE Journal of Selected …, 2022 - ieeexplore.ieee.org
Federated learning (FL) is a promising distributed learning paradigm in which multiple edge
devices (EDs) collaborate to train a shared model without exchanging privacy-sensitive raw …

Handling Data Heterogeneity for IoT Devices in Federated Learning: A Knowledge Fusion Approach

X Zhou, X Lei, C Yang, Y Shi… - IEEE Internet of Things …, 2023 - ieeexplore.ieee.org
Federated learning (FL) supports distributed training of a global machine learning model
across multiple Internet of Things (IoT) devices with the help of a central server. However …

A hierarchical knowledge transfer framework for heterogeneous federated learning

Y Deng, J Ren, C Tang, F Lyu, Y Liu… - IEEE INFOCOM 2023 …, 2023 - ieeexplore.ieee.org
Federated learning (FL) enables distributed clients to collaboratively learn a shared model
while keeping their raw data private. To mitigate the system heterogeneity issues of FL and …

Global knowledge distillation in federated learning

W Pan, L Sun - arXiv preprint arXiv:2107.00051, 2021 - ask.qcloudimg.com
Knowledge distillation has caught a lot of attention in Federated Learning (FL)
recently. It enables FL to train on heterogeneous clients which have different …

Layer-wise knowledge distillation for cross-device federated learning

HQ Le, LX Nguyen, SB Park… - 2023 International …, 2023 - ieeexplore.ieee.org
Federated Learning (FL) has been proposed as a decentralized machine learning system
where multiple clients jointly train the model without sharing private data. In FL, the statistical …

Data-free knowledge distillation for heterogeneous federated learning

Z Zhu, J Hong, J Zhou - International conference on machine …, 2021 - proceedings.mlr.press
Federated Learning (FL) is a decentralized machine-learning paradigm, in which a global
server iteratively averages the model parameters of local users without accessing their data …

FedGKD: Towards Heterogeneous Federated Learning via Global Knowledge Distillation

D Yao, W Pan, Y Dai, Y Wan, X Ding… - IEEE Transactions …, 2023 - ieeexplore.ieee.org
Federated learning, as one enabling technology of edge intelligence, has gained substantial
attention due to its efficacy in training deep learning models without data privacy and …

FedDQ: Communication-efficient federated learning with descending quantization

L Qu, S Song, CY Tsui - GLOBECOM 2022-2022 IEEE Global …, 2022 - ieeexplore.ieee.org
Federated learning (FL) is an emerging learning paradigm that preserves users' privacy.
However, large model sizes and frequent model aggregation cause serious communication …

Class-wise adaptive self distillation for heterogeneous federated learning

Y He, Y Chen, X Yang, Y Zhang… - Proceedings of the 36th …, 2022 - federated-learning.org
The heterogeneity of data distributions among clients (non-IID) has been identified as one of
the key challenges in federated learning. In the local training phase, each client model …