Communication-efficient on-device machine learning: Federated distillation and augmentation under non-IID private data

E Jeong, S Oh, H Kim, J Park, M Bennis… - arXiv preprint arXiv …, 2018 - arxiv.org
On-device machine learning (ML) enables the training process to exploit a massive amount
of user-generated private data samples. To enjoy this benefit, inter-device communication …
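
As a rough sketch of the federated distillation idea named in the title, the snippet below shows devices exchanging per-label averaged model outputs (logits) instead of model weights; the server averages these summaries and each device uses them as a distillation target. Function names and the mean-squared matching loss are illustrative assumptions, not the authors' code.

```python
# Minimal sketch of federated distillation: devices exchange per-label
# average logits rather than model weights. Names are illustrative.
import numpy as np

NUM_CLASSES = 10

def local_logit_means(logits: np.ndarray, labels: np.ndarray) -> np.ndarray:
    """Per-label mean logit vectors computed on one device."""
    means = np.zeros((NUM_CLASSES, NUM_CLASSES))
    for c in range(NUM_CLASSES):
        mask = labels == c
        if mask.any():
            means[c] = logits[mask].mean(axis=0)
    return means

def aggregate(all_means: list) -> np.ndarray:
    """Server side: average the per-label logit summaries across devices."""
    return np.mean(np.stack(all_means), axis=0)

def distillation_loss(student_logits, labels, global_means, temperature=1.0):
    """Local regularizer: match the globally averaged logits for each label.
    The squared-error form is an assumption for this sketch."""
    targets = global_means[labels] / temperature
    return float(np.mean((student_logits / temperature - targets) ** 2))
```

The appeal for communication efficiency is that the uploaded summary scales with the number of classes, not with the model size.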

FedMix: Approximation of Mixup under mean augmented federated learning

T Yoon, S Shin, SJ Hwang, E Yang - arXiv preprint arXiv:2107.00233, 2021 - arxiv.org
Federated learning (FL) allows edge devices to collectively learn a model without directly
sharing data within each device, thus preserving privacy and eliminating the need to store …
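
A minimal sketch of the mean-augmentation idea in the title, assuming clients share only per-batch averages of inputs and labels and mix those averages Mixup-style into their own examples. The actual method approximates the Mixup loss analytically rather than mixing raw data directly, so the code below illustrates only the data-summarization step; all names are illustrative.

```python
# Mean-augmented federated learning, naive mixing variant (illustrative).
import numpy as np

def batch_mean(x: np.ndarray, y_onehot: np.ndarray):
    """The only thing a client shares: averaged inputs and averaged labels."""
    return x.mean(axis=0), y_onehot.mean(axis=0)

def mean_augment(x, y_onehot, mean_x, mean_y, lam=0.1):
    """Mix each local example with a received averaged example, Mixup-style."""
    x_mix = (1.0 - lam) * x + lam * mean_x
    y_mix = (1.0 - lam) * y_onehot + lam * mean_y
    return x_mix, y_mix
```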

Jointly learning from decentralized (federated) and centralized data to mitigate distribution shift

S Augenstein, A Hard, K Partridge… - arXiv preprint arXiv …, 2021 - arxiv.org
With privacy as a motivation, Federated Learning (FL) is an increasingly used paradigm
where learning takes place collectively on edge devices, each with a cache of user …

Federated learning with non-iid data

Y Zhao, M Li, L Lai, N Suda, D Civin… - arXiv preprint arXiv …, 2018 - arxiv.org
Federated learning enables resource-constrained edge compute devices, such as mobile
phones and IoT devices, to learn a shared model for prediction, while keeping the training …

Efficient and private federated learning with partially trainable networks

H Sidahmed, Z Xu, A Garg, Y Cao, M Chen - arXiv preprint arXiv …, 2021 - arxiv.org
Federated learning is used for decentralized training of machine learning models on a large
number (millions) of edge mobile devices. It is challenging because mobile devices often …
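
One plausible reading of "partially trainable networks", sketched below: most weights stay frozen at a shared initialization, and only a small masked subset is trained locally and exchanged with the server, shrinking both computation and communication. The random masks and the 10% trainable fraction are illustrative assumptions.

```python
# Federated training with a partially trainable network (illustrative sketch).
import numpy as np

rng = np.random.default_rng(0)
model = {"w1": rng.normal(size=(784, 128)), "w2": rng.normal(size=(128, 10))}
trainable_frac = 0.1
masks = {k: rng.random(v.shape) < trainable_frac for k, v in model.items()}

def client_update(params, masks, grads, lr=0.01):
    """Apply gradients only where the mask marks a weight as trainable."""
    return {k: params[k] - lr * grads[k] * masks[k] for k in params}

def message_to_server(new_params, old_params, masks):
    """Only the trainable entries need to be uploaded."""
    return {k: (new_params[k] - old_params[k])[masks[k]] for k in new_params}
```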

DP-DyLoRA: Fine-Tuning Transformer-Based Models On-Device under Differentially Private Federated Learning using Dynamic Low-Rank Adaptation

J Xu, K Saravanan, R van Dalen, H Mehmood… - arXiv preprint arXiv …, 2024 - arxiv.org
Federated learning (FL) allows clients in an Internet of Things (IoT) system to collaboratively
train a global model without sharing their local data with a server. However, clients' …
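
The two ingredients named in the title can be sketched separately: a low-rank (LoRA) adapter so clients train and upload only small matrices, and Gaussian clip-and-noise on those uploads as in differentially private federated averaging. Shapes, rank, and noise settings below are illustrative assumptions, and the "dynamic" rank scheduling of DyLoRA is omitted.

```python
# LoRA adapter plus DP-style sanitization of the client upload (illustrative).
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out, rank = 512, 512, 8
W = rng.normal(size=(d_in, d_out))             # frozen pretrained weight
A = rng.normal(scale=0.01, size=(d_in, rank))  # trainable LoRA factor
B = np.zeros((rank, d_out))                    # trainable LoRA factor

def lora_forward(x):
    """Frozen base weight plus low-rank update: x @ (W + A @ B)."""
    return x @ W + (x @ A) @ B

def dp_sanitize(update: np.ndarray, clip_norm=1.0, noise_mult=1.0):
    """Clip the client's adapter update and add Gaussian noise before upload."""
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip_norm / (norm + 1e-12))
    return clipped + rng.normal(scale=noise_mult * clip_norm, size=update.shape)
```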

Hiding in the crowd: Federated data augmentation for on-device learning

E Jeong, S Oh, J Park, H Kim, M Bennis… - IEEE Intelligent …, 2020 - ieeexplore.ieee.org
To cope with the lack of on-device machine learning samples, this article presents a
distributed data augmentation algorithm, coined federated data augmentation (FAug). In …
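
A skeleton consistent with the FAug description: a device identifies its under-represented labels, uploads a few seed samples, and later augments its local dataset with a generator trained at the server. The generator below is a stand-in (noise around the seeds), not the conditional generative model used in FAug; thresholds and names are illustrative.

```python
# FAug-style workflow skeleton (illustrative; the real generator is server-trained).
import numpy as np

rng = np.random.default_rng(0)

def minority_labels(labels, num_classes, threshold=10):
    """Labels for which the device holds too few samples."""
    counts = np.bincount(labels, minlength=num_classes)
    return [c for c in range(num_classes) if counts[c] < threshold]

def pick_seeds(x, labels, label, k=3):
    """A handful of seed samples uploaded to the server for one minority label."""
    idx = np.flatnonzero(labels == label)[:k]
    return x[idx]

def toy_generator(seeds, n_needed):
    """Placeholder for the downloaded generator used to fill the local gap."""
    base = seeds[rng.integers(0, len(seeds), size=n_needed)]
    return base + rng.normal(scale=0.05, size=base.shape)
```

Since only a handful of seed samples ever leave the device, uplink cost stays small while the local label distribution moves closer to IID.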

Precision-weighted federated learning

J Reyes, L Di Jorio, C Low-Kam… - arXiv preprint arXiv …, 2021 - arxiv.org
Federated Learning using the Federated Averaging algorithm has shown great advantages
for large-scale applications that rely on collaborative learning, especially when the training …
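
Assuming "precision-weighted" means weighting each client's contribution by the inverse of an estimated update variance (its precision) rather than by sample count as in Federated Averaging, the aggregation step can be sketched as below; the per-client scalar variance is an illustrative simplification.

```python
# Precision-weighted aggregation of client updates (illustrative sketch).
import numpy as np

def precision_weighted_average(updates, variances, eps=1e-8):
    """updates: list of flat parameter vectors; variances: one scalar each."""
    precisions = np.array([1.0 / (v + eps) for v in variances])
    weights = precisions / precisions.sum()
    return sum(w * u for w, u in zip(weights, np.stack(updates)))

# A noisier client (larger variance) contributes less to the global average.
u1, u2 = np.ones(4), 3 * np.ones(4)
print(precision_weighted_average([u1, u2], variances=[0.1, 1.0]))
```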

Scalable privacy-preserving distributed learning

D Froelicher, JR Troncoso-Pastoriza, A Pyrgelis… - arXiv preprint arXiv …, 2020 - arxiv.org
In this paper, we address the problem of privacy-preserving distributed learning and the
evaluation of machine-learning models by analyzing it in the widespread MapReduce …
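
A plaintext sketch of the MapReduce view of distributed learning referenced here: each node "maps" its data partition to a local gradient and a "reduce" step combines them. The cited system carries out these steps under (multiparty) homomorphic encryption, which this sketch omits entirely; the squared-error objective is an assumption.

```python
# MapReduce-style distributed gradient computation (plaintext, illustrative).
import numpy as np
from functools import reduce

def map_step(X, y, w):
    """Local gradient of squared error on one node's data partition."""
    return X.T @ (X @ w - y) / len(y)

def reduce_step(grads):
    """Combine the per-node gradients (here: a simple average)."""
    return reduce(np.add, grads) / len(grads)
```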

HideNseek: Federated lottery ticket via server-side pruning and sign supermask

AK Vallapuram, P Zhou, YD Kwon, LH Lee… - arXiv preprint arXiv …, 2022 - arxiv.org
Federated learning alleviates the privacy risk in distributed learning by transmitting only the
local model updates to the central server. However, it faces challenges including statistical …
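
One illustrative reading of the title's "server-side pruning and sign supermask", sketched below: the server fixes the weights and a pruning mask, and clients train only real-valued scores whose signs flip the surviving weights, so uploads reduce to binary signs. This is a reading of the title, not the paper's exact formulation; all names are assumptions.

```python
# Sign supermask over a pruned, frozen weight matrix (illustrative sketch).
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(128, 64))                  # frozen weights
prune_mask = rng.random(W.shape) < 0.5          # server-side pruning mask
scores = rng.normal(scale=0.01, size=W.shape)   # the only trainable tensor

def effective_weight(W, prune_mask, scores):
    """Surviving weights keep their magnitude; their sign comes from scores."""
    return np.abs(W) * np.sign(scores) * prune_mask

def client_message(scores):
    """Clients upload only binary signs, cutting communication sharply."""
    return np.sign(scores).astype(np.int8)
```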