Distributed artificial intelligence empowered by end-edge-cloud computing: A survey

S Duan, D Wang, J Ren, F Lyu, Y Zhang… - … Surveys & Tutorials, 2022 - ieeexplore.ieee.org
As the computing paradigm shifts from cloud computing to end-edge-cloud computing, artificial
intelligence is likewise evolving from a centralized paradigm to a distributed one …

Distributed learning for wireless communications: Methods, applications and challenges

L Qian, P Yang, M Xiao, OA Dobre… - IEEE Journal of …, 2022 - ieeexplore.ieee.org
With its privacy-preserving and decentralized features, distributed learning plays an
irreplaceable role in the era of wireless networks with a plethora of smart terminals, an …

Federated learning over noisy channels: Convergence analysis and design examples

X Wei, C Shen - IEEE Transactions on Cognitive …, 2022 - ieeexplore.ieee.org
Does Federated Learning (FL) work when both uplink and downlink communications have
errors? How much communication noise can FL handle and what is its impact on the …
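
To make the paper's opening question concrete, here is a toy FedAvg round in which both the uplink updates and the downlink broadcast are corrupted by additive Gaussian noise. The channel model, noise scales, and function names are illustrative assumptions for this sketch, not the paper's analysis or design.

import numpy as np

def noisy(x, sigma):
    # additive white Gaussian noise as a stand-in for an imperfect channel
    return x + sigma * np.random.randn(*x.shape)

def fedavg_round(global_w, client_grads, lr=0.1, sigma_up=0.01, sigma_down=0.01):
    received = [noisy(g, sigma_up) for g in client_grads]  # noisy uplink
    avg = np.mean(received, axis=0)                        # server-side aggregation
    new_w = global_w - lr * avg                            # gradient step
    return noisy(new_w, sigma_down)                        # noisy downlink broadcast

w = np.zeros(8)
grads = [np.random.randn(8) for _ in range(4)]
w = fedavg_round(w, grads)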

Federated learning with sparsified model perturbation: Improving accuracy under client-level differential privacy

R Hu, Y Guo, Y Gong - IEEE Transactions on Mobile Computing, 2023 - ieeexplore.ieee.org
Federated learning (FL), which enables edge devices to collaboratively learn a shared model
while keeping their training data local, has recently received great attention and can protect …
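
The title's recipe, in broad strokes: clip each client's model update to bound its sensitivity, sparsify it, then add Gaussian noise before upload. The sketch below shows that general pipeline, not the paper's exact mechanism; the clipping bound, sparsity level k, and noise scale sigma are all illustrative assumptions.

import numpy as np

def private_sparse_update(update, clip=1.0, k=10, sigma=0.5):
    # clip to bound per-client sensitivity
    update = update * min(1.0, clip / (np.linalg.norm(update) + 1e-12))
    # keep only the k largest-magnitude coordinates
    idx = np.argpartition(np.abs(update), -k)[-k:]
    sparse = np.zeros_like(update)
    sparse[idx] = update[idx]
    # Gaussian perturbation for client-level differential privacy
    return sparse + sigma * np.random.randn(*update.shape)

upload = private_sparse_update(np.random.randn(100))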

AC-SGD: Adaptively compressed SGD for communication-efficient distributed learning

G Yan, T Li, SL Huang, T Lan… - IEEE Journal on Selected …, 2022 - ieeexplore.ieee.org
Gradient compression (e.g., gradient quantization and gradient sparsification) is a core
technique for reducing communication costs in distributed learning systems. The recent trend …
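
As a baseline for the compression techniques the abstract names, here is plain top-k gradient sparsification with NumPy: send only the indices and values of the largest-magnitude entries and rebuild a dense gradient on the receiver. The adaptive compression-rate selection that gives AC-SGD its name is not shown, and all names here are illustrative.

import numpy as np

def topk_sparsify(grad, k):
    # keep the k largest-magnitude entries; transmit (indices, values) only
    flat = grad.ravel()
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    return idx, flat[idx]

def desparsify(idx, vals, shape):
    # receiver rebuilds a dense gradient, with zeros elsewhere
    flat = np.zeros(int(np.prod(shape)))
    flat[idx] = vals
    return flat.reshape(shape)

grad = np.random.randn(4, 5)
idx, vals = topk_sparsify(grad, k=4)      # only ~4/20 of the entries cross the wire
recovered = desparsify(idx, vals, grad.shape)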

Learned gradient compression for distributed deep learning

L Abrahamyan, Y Chen, G Bekoulis… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org
Training deep neural networks on large datasets containing high-dimensional data requires
a large amount of computation. A solution to this problem is data-parallel distributed training …
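
A toy version of the "learned" idea: a linear autoencoder trained by SGD to reconstruct gradients from a low-dimensional code, so the sender transmits c floats instead of d. The architecture, data distribution, and training loop are illustrative assumptions; the paper learns far richer codecs.

import numpy as np

rng = np.random.default_rng(0)
d, c = 64, 8                               # gradient dimension and code dimension
E = 0.1 * rng.standard_normal((c, d))      # learned encoder (sender side)
D = 0.1 * rng.standard_normal((d, c))      # learned decoder (receiver side)
lr = 1e-3

for _ in range(5000):                      # fit on sample gradients
    g = rng.standard_normal(d)
    z = E @ g                              # transmit c floats instead of d
    err = D @ z - g                        # reconstruction error
    gD = np.outer(err, z)                  # grad of 0.5*||D@E@g - g||^2 w.r.t. D
    gE = np.outer(D.T @ err, g)            # ... and w.r.t. E
    D -= lr * gD
    E -= lr * gE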

DeepReduce: A sparse-tensor communication framework for federated deep learning

H Xu, K Kostopoulou, A Dutta, X Li… - Advances in …, 2021 - proceedings.neurips.cc
Sparse tensors appear frequently in federated deep learning, either as a direct artifact of the
deep neural network's gradients, or as a result of an explicit sparsification process. Existing …
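
The core idea of decoupling a sparse tensor into an index structure and a value payload, each of which can then be compressed independently, can be sketched as below; the bitmap index encoding is an illustrative choice, not one of DeepReduce's actual codecs.

import numpy as np

def encode_sparse(t):
    flat = t.ravel()
    mask = flat != 0
    bitmap = np.packbits(mask)              # 1 bit per position indexes the nonzeros
    values = flat[mask].astype(np.float32)  # value payload, compressible separately
    return bitmap, values, t.shape

def decode_sparse(bitmap, values, shape):
    n = int(np.prod(shape))
    mask = np.unpackbits(bitmap, count=n).astype(bool)
    flat = np.zeros(n, dtype=np.float32)
    flat[mask] = values
    return flat.reshape(shape)

t = np.zeros(100, dtype=np.float32)
t[[3, 40, 77]] = [0.5, -1.2, 2.0]
assert np.array_equal(decode_sparse(*encode_sparse(t)), t)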

Enabling all in-edge deep learning: A literature review

P Joshi, M Hasanuzzaman, C Thapa, H Afli… - IEEE Access, 2023 - ieeexplore.ieee.org
In recent years, deep learning (DL) models have demonstrated remarkable achievements
on non-trivial tasks such as speech recognition, image processing, and natural language …

Communication compression techniques in distributed deep learning: A survey

Z Wang, M Wen, Y Xu, Y Zhou, JH Wang… - Journal of Systems …, 2023 - Elsevier
Training datasets and neural network models are growing increasingly large, so training deep
learning models on a single machine has become unbearably slow. To reduce …

FedOComp: Two-timescale online gradient compression for over-the-air federated learning

Y Xue, L Su, VKN Lau - IEEE Internet of Things Journal, 2022 - ieeexplore.ieee.org
Federated learning (FL) is a machine learning framework in which multiple distributed edge
Internet of Things (IoT) devices collaboratively train a model under the orchestration of a …
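
A minimal sketch of over-the-air aggregation, the setting FedOComp targets: devices transmit analog-precoded updates simultaneously and the multiple-access channel sums them "for free". The channel model and power control here are simplifying assumptions, and the two-timescale compressor itself is not shown.

import numpy as np

rng = np.random.default_rng(1)
updates = [rng.standard_normal(16) for _ in range(5)]  # one update per device
h = rng.uniform(0.5, 1.5, size=5)                      # per-device channel gains
tx = [u / hk for u, hk in zip(updates, h)]             # precode: invert the channel
rx = sum(hk * x for hk, x in zip(h, tx))               # superposition over the air
rx = rx + 0.05 * rng.standard_normal(16)               # receiver noise
global_update = rx / len(updates)                      # noisy average at the server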