Data-free knowledge distillation for heterogeneous federated learning

Z Zhu, J Hong, J Zhou - International conference on machine …, 2021 - proceedings.mlr.press
Federated Learning (FL) is a decentralized machine-learning paradigm, in which a global
server iteratively averages the model parameters of local users without accessing their data …
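
The aggregation step this snippet refers to is the standard server-side parameter averaging (FedAvg-style). A minimal sketch, assuming each client uploads its parameters as NumPy arrays together with its local sample count (the function and variable names below are illustrative, not taken from the paper):

```python
import numpy as np

def federated_average(client_states, client_sizes):
    """Weighted average of client parameters; raw data never leaves the clients."""
    total = sum(client_sizes)
    weights = [n / total for n in client_sizes]
    return {
        key: sum(w * state[key] for state, w in zip(client_states, weights))
        for key in client_states[0]
    }

# Toy example: two clients sharing one parameter tensor, weighted by local data size.
clients = [{"w": np.array([1.0, 2.0])}, {"w": np.array([3.0, 4.0])}]
print(federated_average(clients, client_sizes=[10, 30]))  # {'w': array([2.5, 3.5])}
```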

Local-global knowledge distillation in heterogeneous federated learning with non-IID data

D Yao, W Pan, Y Dai, Y Wan, X Ding, H Jin… - arXiv preprint arXiv …, 2021 - arxiv.org
Federated learning enables multiple clients to collaboratively learn a global model by
periodically aggregating the clients' models without transferring the local data. However, due …

Meta knowledge condensation for federated learning

P Liu, X Yu, JT Zhou - arXiv preprint arXiv:2209.14851, 2022 - arxiv.org
Existing federated learning paradigms typically exchange distributed models extensively at a
central solver to obtain a more powerful model. However, this would incur severe …

Towards model agnostic federated learning using knowledge distillation

A Afonin, SP Karimireddy - arXiv preprint arXiv:2110.15210, 2021 - arxiv.org
Is it possible to design a universal API for federated learning with which an ad-hoc group
of data holders (agents) can collaborate with each other and perform federated learning? Such …

Knowledge distillation for federated learning: a practical guide

A Mora, I Tenison, P Bellavista, I Rish - arXiv preprint arXiv:2211.04742, 2022 - arxiv.org
Federated Learning (FL) enables the training of Deep Learning models without centrally
collecting possibly sensitive raw data. This paves the way for stronger privacy guarantees …

Towards data-independent knowledge transfer in model-heterogeneous federated learning

J Zhang, S Guo, J Guo, D Zeng, J Zhou… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Federated Distillation (FD) extends classic Federated Learning (FL) to a more general
training framework that enables model-heterogeneous collaborative learning by Knowledge …
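
The knowledge-distillation step that FD-style methods build on can be sketched briefly: a student model is trained to match the softened output distribution of a teacher (e.g., an aggregated or server-side model), so only predictions rather than parameters need to be exchanged. The temperature and tensor shapes below are illustrative assumptions, not this paper's setup:

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL divergence between softened teacher and student predictions."""
    soft_targets = F.softmax(teacher_logits / temperature, dim=-1)
    log_probs = F.log_softmax(student_logits / temperature, dim=-1)
    return F.kl_div(log_probs, soft_targets, reduction="batchmean") * temperature ** 2

student_logits = torch.randn(8, 10)  # batch of 8 examples, 10 classes
teacher_logits = torch.randn(8, 10)
loss = distillation_loss(student_logits, teacher_logits)
```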

On the convergence of clustered federated learning

J Ma, G Long, T Zhou, J Jiang, C Zhang - arXiv preprint arXiv:2202.06187, 2022 - arxiv.org
Knowledge sharing and model personalization are essential components to tackle the non-
IID challenge in federated learning (FL). Most existing FL methods focus on two extremes: 1) …

QuPeD: Quantized personalization via distillation with applications to federated learning

K Ozkara, N Singh, D Data… - Advances in Neural …, 2021 - proceedings.neurips.cc
Traditionally, federated learning (FL) aims to train a single global model collaboratively
across multiple clients and a server. Two natural challenges that FL algorithms …

FedBE: Making Bayesian model ensemble applicable to federated learning

HY Chen, WL Chao - arXiv preprint arXiv:2009.01974, 2020 - arxiv.org
Federated learning aims to collaboratively train a strong global model by accessing users'
locally trained models but not their own data. A crucial step is therefore to aggregate local …
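
A rough sketch of the ensemble step this abstract points to, under the assumption that the server holds several candidate global models (e.g., sampled from a distribution fitted to the aggregated weights) and a small unlabeled batch: the ensemble's averaged class probabilities serve as soft targets for distilling a single global model. All names below are hypothetical.

```python
import torch
import torch.nn.functional as F

def ensemble_soft_labels(models, unlabeled_batch):
    """Average class probabilities over the sampled models (the model ensemble)."""
    with torch.no_grad():
        probs = [F.softmax(m(unlabeled_batch), dim=-1) for m in models]
    return torch.stack(probs).mean(dim=0)

models = [torch.nn.Linear(4, 3) for _ in range(5)]  # stand-ins for sampled global models
x = torch.randn(8, 4)                               # unlabeled server-side batch
soft_targets = ensemble_soft_labels(models, x)      # shape (8, 3); each row sums to 1
```

These soft labels would then serve as distillation targets (e.g., with a KL-divergence loss as sketched earlier) for training the single global model.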

GPT-FL: Generative pre-trained model-assisted federated learning

T Zhang, T Feng, S Alam, D Dimitriadis… - arXiv preprint arXiv …, 2023 - arxiv.org
In this work, we propose GPT-FL, a generative pre-trained model-assisted federated
learning (FL) framework. At its core, GPT-FL leverages generative pre-trained models to …