With its privacy-preserving and decentralized features, distributed learning plays an irreplaceable role in the era of wireless networks with a plethora of smart terminals, an …
X. Wei, C. Shen. IEEE Transactions on Cognitive …, 2022 (ieeexplore.ieee.org):
Does Federated Learning (FL) work when both uplink and downlink communications have errors? How much communication noise can FL handle and what is its impact on the …
R. Hu, Y. Guo, Y. Gong. IEEE Transactions on Mobile Computing, 2023 (ieeexplore.ieee.org):
Federated learning (FL), which enables edge devices to collaboratively learn a shared model while keeping their training data local, has received great attention recently and can protect …
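The collaborative training setup these FL snippets describe is commonly instantiated by FedAvg-style weighted averaging: each device trains locally and the server averages the returned parameters, weighted by local dataset size. A minimal sketch, assuming clients return flat NumPy parameter vectors; the function and variable names are illustrative, not taken from any of the papers above.

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """Weighted average of client model parameters (FedAvg-style).

    client_weights: list of flat parameter vectors, one per client.
    client_sizes:   local training-set size per client, used as weights.
    """
    total = sum(client_sizes)
    stacked = np.stack(client_weights)        # shape: (num_clients, dim)
    coeffs = np.array(client_sizes) / total   # aggregation weights, sum to 1
    return coeffs @ stacked                   # weighted sum -> shape (dim,)

# Toy example: three clients; the third holds half the data,
# so its parameters get half the weight.
w = [np.array([1.0, 2.0]), np.array([3.0, 4.0]), np.array([5.0, 6.0])]
n = [10, 10, 20]
global_w = fedavg(w, n)  # -> array([3.5, 4.5])
```

Note that only model parameters leave the device; the raw training data stays local, which is the privacy property the snippets emphasize.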
G. Yan, T. Li, S. L. Huang, T. Lan, … IEEE Journal on Selected …, 2022 (ieeexplore.ieee.org):
Gradient compression (e.g., gradient quantization and gradient sparsification) is a core technique for reducing communication costs in distributed learning systems. The recent trend …
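The gradient sparsification mentioned in this snippet is often realized as top-k selection: only the k largest-magnitude gradient entries (their indices and values) are communicated, and the rest are treated as zero. A minimal illustrative sketch under that assumption, not the specific method of any paper listed here.

```python
import numpy as np

def topk_sparsify(grad, k):
    """Keep the k largest-magnitude entries of a gradient; zero the rest.

    Returns the dense sparse gradient plus the (indices, values) pair
    that would actually be transmitted in a distributed setting.
    """
    idx = np.argpartition(np.abs(grad), -k)[-k:]  # indices of top-k magnitudes
    sparse = np.zeros_like(grad)
    sparse[idx] = grad[idx]
    return sparse, idx, grad[idx]

g = np.array([0.1, -3.0, 0.05, 2.0, -0.2])
sparse_g, idx, vals = topk_sparsify(g, 2)
# Only the two largest-magnitude entries (-3.0 and 2.0) survive;
# sending (idx, vals) instead of g cuts the communicated payload.
```

In practice this is typically combined with error feedback (accumulating the dropped entries locally for the next round) so that the discarded mass is not lost, but that refinement is omitted here for brevity.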
Training deep neural networks on large datasets containing high-dimensional data requires a large amount of computation. A solution to this problem is data-parallel distributed training …
Sparse tensors appear frequently in federated deep learning, either as a direct artifact of the deep neural network's gradients, or as a result of an explicit sparsification process. Existing …
In recent years, deep learning (DL) models have demonstrated remarkable achievements on non-trivial tasks such as speech recognition, image processing, and natural language …
Z. Wang, M. Wen, Y. Xu, Y. Zhou, J. H. Wang, … Journal of Systems …, 2023 (Elsevier):
Nowadays, training data and neural network models are growing increasingly large, and the training time of deep learning on a single machine becomes unbearably long. To reduce …
Y. Xue, L. Su, V. K. N. Lau. IEEE Internet of Things Journal, 2022 (ieeexplore.ieee.org):
Federated learning (FL) is a machine learning framework in which multiple distributed edge Internet of Things (IoT) devices collaboratively train a model under the orchestration of a …