Wireless federated distillation for distributed edge learning with heterogeneous data

JH Ahn, O Simeone, J Kang - 2019 IEEE 30th Annual …, 2019 - ieeexplore.ieee.org
Cooperative training methods for distributed machine learning typically assume noiseless
and ideal communication channels. This work studies some of the opportunities and …
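
To make the setting concrete, below is a minimal sketch of the federated-distillation idea over a noisy wireless link: clients exchange per-class average logits ("soft labels") rather than model parameters, with additive Gaussian noise standing in for the channel. All names, shapes, and the AWGN model are illustrative assumptions, not this paper's actual scheme.

```python
# Minimal sketch: federated distillation with a noisy uplink (assumptions, not the paper's API).
import numpy as np

rng = np.random.default_rng(0)
num_clients, num_classes, snr_db = 5, 10, 10.0

# Each client's locally computed average logit vector per class.
local_logits = rng.normal(size=(num_clients, num_classes, num_classes))

# Transmit over an AWGN channel: received = sent + noise scaled to the target SNR.
signal_power = np.mean(local_logits ** 2)
noise_power = signal_power / (10 ** (snr_db / 10))
received = local_logits + rng.normal(scale=np.sqrt(noise_power), size=local_logits.shape)

# Server (or peers) average the noisy soft labels to form the distillation target.
global_soft_labels = received.mean(axis=0)
print(global_soft_labels.shape)  # (num_classes, num_classes)
```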

Cooperative learning via federated distillation over fading channels

JH Ahn, O Simeone, J Kang - ICASSP 2020-2020 IEEE …, 2020 - ieeexplore.ieee.org
Cooperative training methods for distributed machine learning are typically based on the
exchange of local gradients or local model parameters. The latter approach is known as …

Federated learning with quantization constraints

N Shlezinger, M Chen, YC Eldar… - ICASSP 2020-2020 …, 2020 - ieeexplore.ieee.org
Traditional deep learning models are trained on centralized servers using labeled sample
data collected from edge devices. This data often includes private information, which the …
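
As a rough illustration of training under quantization constraints, the sketch below applies subtractive dithered uniform quantization to a model update, a generic stand-in for the paper's actual quantizer; the step size and the shared dither are assumptions.

```python
# Hedged sketch: a client sends a dithered, uniformly quantized model update and
# the server subtracts the shared dither, giving a bounded reconstruction error.
import numpy as np

rng = np.random.default_rng(1)

def quantize_update(update, step, dither):
    # Subtractive dithered uniform quantization: add the shared dither, round to
    # the grid; the receiver subtracts the same dither.
    return np.round((update + dither) / step) * step

model_update = rng.normal(size=1000)        # local weight delta after training
step = 0.05                                 # quantizer resolution (rate budget)
dither = rng.uniform(-step / 2, step / 2, size=model_update.shape)  # shared seed

sent = quantize_update(model_update, step, dither)
recovered = sent - dither                   # server side
print(float(np.abs(recovered - model_update).max()))  # bounded by step / 2
```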

Communication efficient federated learning over multiple access channels

WT Chang, R Tandon - arXiv preprint arXiv:2001.08737, 2020 - arxiv.org
In this work, we study the problem of federated learning (FL), where distributed users aim to
jointly train a machine learning model with the help of a parameter server (PS). In each …
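
The sketch below illustrates the basic parameter-server loop the abstract refers to, with one-bit (scaled sign) gradient compression standing in for a rate-limited uplink; the toy quadratic losses and the compression rule are assumptions, not the paper's scheme.

```python
# Hedged sketch: per-round gradient aggregation at a parameter server (PS).
import numpy as np

rng = np.random.default_rng(2)
num_users, dim, rounds, lr = 4, 20, 50, 0.1
targets = rng.normal(size=(num_users, dim))  # each user's local optimum
w = np.zeros(dim)                            # shared model held at the PS

for _ in range(rounds):
    # Local gradients of the toy quadratic loss 0.5 * ||w - target||^2.
    grads = w - targets
    # One-bit (scaled sign) compression before the uplink.
    compressed = np.sign(grads) * np.mean(np.abs(grads), axis=1, keepdims=True)
    w -= lr * compressed.mean(axis=0)        # PS aggregates and updates

# Distance to the minimizer of the summed losses, up to compression error.
print(float(np.linalg.norm(w - targets.mean(axis=0))))
```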

Communication-efficient federated distillation

F Sattler, A Marban, R Rischke, W Samek - arXiv preprint arXiv …, 2020 - arxiv.org
Communication constraints are one of the major challenges preventing the widespread
adoption of Federated Learning systems. Recently, Federated Distillation (FD), a new …

Decentralized federated learning: Balancing communication and computing costs

W Liu, L Chen, W Zhang - IEEE Transactions on Signal and …, 2022 - ieeexplore.ieee.org
Decentralized stochastic gradient descent (SGD) is a driving engine for decentralized
federated learning (DFL). The performance of decentralized SGD is jointly influenced by …
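
A minimal sketch of decentralized SGD of the kind analysed here: each node takes a local gradient step, then gossip-averages with its neighbours through a doubly stochastic mixing matrix, so no central server is needed. The ring topology, quadratic losses, and constant stepsize are toy assumptions.

```python
# Hedged sketch: decentralized SGD with gossip averaging over a ring.
import numpy as np

rng = np.random.default_rng(3)
n, dim, lr, steps = 5, 8, 0.1, 200
targets = rng.normal(size=(n, dim))          # node i minimizes 0.5 * ||x - t_i||^2
x = np.zeros((n, dim))                       # one model copy per node

# Ring mixing matrix: each node averages with its two neighbours (doubly stochastic).
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i - 1) % n] = 0.25
    W[i, (i + 1) % n] = 0.25

for _ in range(steps):
    grads = x - targets                      # local gradients
    x = W @ (x - lr * grads)                 # gossip average of the SGD step

# Residual error scales with the stepsize (standard constant-stepsize behaviour).
print(float(np.abs(x - targets.mean(axis=0)).max()))
```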

Federated learning via decentralized dataset distillation in resource-constrained edge environments

R Song, D Liu, DZ Chen, A Festag… - … Joint Conference on …, 2023 - ieeexplore.ieee.org
In federated learning, all networked clients contribute to the model training cooperatively.
However, with model sizes increasing, even sharing the trained partial models often leads to …
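
To illustrate the one-shot "send distilled data, not models" idea, the toy sketch below compresses each client's data into per-class means; real dataset distillation optimises the synthetic points, so this stand-in distiller and all names are hypothetical.

```python
# Toy sketch: clients send a tiny synthetic set once instead of model updates repeatedly.
import numpy as np

rng = np.random.default_rng(4)
num_clients, per_class, dim = 3, 30, 16

def distill(features, labels):
    # One synthetic point per class: the class mean (a crude distillation proxy).
    classes = np.unique(labels)
    return np.stack([features[labels == c].mean(axis=0) for c in classes]), classes

server_x, server_y = [], []
for _ in range(num_clients):
    labels = np.repeat([0, 1], per_class)
    feats = rng.normal(loc=labels[:, None] * 2.0, size=(2 * per_class, dim))
    syn_x, syn_y = distill(feats, labels)    # tiny synthetic set, sent once
    server_x.append(syn_x); server_y.append(syn_y)

# Server pools the synthetic sets and can train any model on them in one shot.
server_x = np.concatenate(server_x); server_y = np.concatenate(server_y)
print(server_x.shape)  # (num_clients * 2, dim): far smaller than the raw data
```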

Federated learning over wireless device-to-device networks: Algorithms and convergence analysis

H Xing, O Simeone, S Bi - IEEE Journal on Selected Areas in …, 2021 - ieeexplore.ieee.org
The proliferation of Internet-of-Things (IoT) devices and cloud-computing applications over
siloed data centers is motivating renewed interest in the collaborative training of a shared …
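
Convergence analyses for device-to-device topologies typically assume a doubly stochastic mixing matrix; below is a small sketch of constructing one with the standard Metropolis-Hastings rule. The example graph is an arbitrary assumption.

```python
# Hedged sketch: Metropolis-Hastings mixing weights for a D2D graph.
import numpy as np

# Undirected D2D links as adjacency pairs over 5 devices.
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0), (1, 3)]
n = 5
deg = np.zeros(n, dtype=int)
for i, j in edges:
    deg[i] += 1; deg[j] += 1

W = np.zeros((n, n))
for i, j in edges:
    w = 1.0 / (1 + max(deg[i], deg[j]))      # Metropolis-Hastings rule
    W[i, j] = W[j, i] = w
W[np.diag_indices(n)] = 1.0 - W.sum(axis=1)  # self-weights keep rows summing to 1

print(np.allclose(W.sum(axis=0), 1.0), np.allclose(W.sum(axis=1), 1.0))  # doubly stochastic
```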

CFD: Communication-efficient federated distillation via soft-label quantization and delta coding

F Sattler, A Marban, R Rischke… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org
Communication constraints are one of the major challenges preventing the widespread
adoption of Federated Learning systems. Recently, Federated Distillation (FD), a new …
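
A hedged sketch of the two ideas the title names, under assumed parameters: soft labels are uniformly quantized, and only the change against the previous round's shared reconstruction is transmitted, so entropy coding can exploit the sparsity of the delta.

```python
# Hedged sketch: soft-label quantization plus delta coding across rounds.
import numpy as np

rng = np.random.default_rng(5)
step = 0.1                                   # quantizer resolution (bit budget)

def quantize(x):
    return np.round(x / step) * step         # uniform soft-label quantizer

prev = np.zeros(10)                          # last round's reconstruction (shared state)
for _ in range(3):
    soft_labels = rng.dirichlet(np.ones(10)) # this round's predictions on public data
    delta = quantize(soft_labels - prev)     # encode only the quantized change
    prev = prev + delta                      # sender and receiver update the same state
    nonzero = int(np.count_nonzero(delta))   # entropy coding would exploit this sparsity
    print(nonzero, float(np.abs(prev - soft_labels).max()))  # cost vs. error per round
```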

Federated learning: A signal processing perspective

T Gafni, N Shlezinger, K Cohen… - IEEE Signal …, 2022 - ieeexplore.ieee.org
The dramatic success of deep learning is largely due to the availability of data. Data
samples are often acquired on edge devices, such as smartphones, vehicles, and sensors …