High-dimensional stochastic gradient quantization for communication-efficient edge learning

Y Du, S Yang, K Huang - IEEE Transactions on Signal …, 2020 - ieeexplore.ieee.org
Edge machine learning involves the deployment of learning algorithms at the wireless
network edge so as to leverage massive mobile data for enabling intelligent applications …

Machine learning paradigms for next-generation wireless networks

C Jiang, H Zhang, Y Ren, Z Han… - IEEE Wireless …, 2016 - ieeexplore.ieee.org
Next-generation wireless networks are expected to support extremely high data rates and
radically new applications, which require a new wireless radio technology paradigm. The …

Energy-efficient distributed machine learning at wireless edge with device-to-device communication

R Hu, Y Guo, Y Gong - ICC 2022 - IEEE International …, 2022 - ieeexplore.ieee.org
This paper considers a federated edge learning (FEL) system where a base station (BS)
coordinates a set of edge devices to train a shared machine learning model collaboratively …

Federated learning at the network edge: When not all nodes are created equal

F Malandrino, CF Chiasserini - IEEE Communications …, 2021 - ieeexplore.ieee.org
Under the federated learning paradigm, a set of nodes can cooperatively train a machine
learning model with the help of a centralized server. Such a server is also tasked with …

Knowledge-aided federated learning for energy-limited wireless networks

Z Chen, W Yi, Y Liu… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
The conventional model aggregation-based federated learning (FL) approach requires all
local models to have the same architecture, which fails to support practical scenarios with …

Scheduling and aggregation design for asynchronous federated learning over wireless networks

CH Hu, Z Chen, EG Larsson - IEEE Journal on Selected Areas …, 2023 - ieeexplore.ieee.org
Federated Learning (FL) is a collaborative machine learning (ML) framework that combines
on-device training and server-based aggregation to train a common ML model among …

Laplacian matrix sampling for communication-efficient decentralized learning

CC Chiu, X Zhang, T He, S Wang… - IEEE Journal on …, 2023 - ieeexplore.ieee.org
We consider the problem of training a given machine learning model by decentralized
parallel stochastic gradient descent over training data distributed across multiple nodes …

Communication-efficient coded distributed multi-task learning

H Tang, H Hu, K Yuan, Y Wu - 2021 IEEE Global …, 2021 - ieeexplore.ieee.org
Consider a distributed multi-task learning (MTL) framework where the distributed users first
train their own models based on the local data and then send the local updates to the server …

Asynchronous multi-model federated learning over wireless networks: Theory, modeling, and optimization

ZL Chang, S Hosseinalipour, M Chiang… - arXiv preprint arXiv …, 2023 - arxiv.org
Federated learning (FL) has emerged as a key technique for distributed machine learning
(ML). Most literature on FL has focused on systems with (i) ML model training for a single …

Federated learning: Strategies for improving communication efficiency

J Konečný, HB McMahan, FX Yu, P Richtárik… - arXiv preprint arXiv …, 2016 - arxiv.org
Federated Learning is a machine learning setting where the goal is to train a high-quality
centralized model while training data remains distributed over a large number of clients …