L Li, D Shi, R Hou, H Li, M Pan… - IEEE INFOCOM 2021 …, 2021 - ieeexplore.ieee.org
Recent advances in machine learning, wireless communication, and mobile hardware technologies promise to enable federated learning (FL) over massive numbers of mobile edge devices …
Z Wang, M Wen, Y Xu, Y Zhou, JH Wang… - Journal of Systems …, 2023 - Elsevier
Nowadays, the training data and neural network models are getting increasingly large. The training time of deep learning will become unbearably long on a single machine. To reduce …
5G is the fifth generation of cellular networks. It enables billions of connected devices to gather and share information in real time; a key facilitator in Industrial Internet of …
We introduce a framework, Artemis, to tackle the problem of learning in a distributed or federated setting with communication constraints and device partial participation. Several …
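The snippet above concerns learning under partial device participation: in each round only a sampled subset of clients contributes an update. A minimal sketch of that sampling-and-averaging step is below; the function name, the uniform-sampling assumption, and the participation fraction are illustrative choices, not the Artemis algorithm itself.

```python
import numpy as np

def average_partial_participation(client_grads, participate_frac=0.5, seed=0):
    """Average gradients from a randomly sampled subset of clients,
    mimicking partial device participation in federated learning.
    Hypothetical sketch: uniform sampling without replacement is assumed."""
    rng = np.random.default_rng(seed)
    n = len(client_grads)
    m = max(1, int(participate_frac * n))      # number of participating devices
    chosen = rng.choice(n, size=m, replace=False)
    return np.mean([client_grads[i] for i in chosen], axis=0)

# Four clients, each reporting a 2-dimensional gradient.
grads = [np.array([1.0, 0.0]), np.array([0.0, 1.0]),
         np.array([1.0, 1.0]), np.array([2.0, 2.0])]
avg = average_partial_participation(grads, participate_frac=0.5)
```

Only half the clients are averaged per call, so the result is an unbiased but noisier estimate of the full-participation average.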
Federated learning (FL) over mobile devices has fostered numerous intriguing applications/services, many of which are delay-sensitive. In this paper, we propose a service …
Noisy gradient algorithms have emerged as one of the most popular algorithms for distributed optimization with massive data. Choosing proper step-size schedules is an …
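The snippet above highlights that the step-size schedule governs how a noisy gradient method trades off convergence speed against the variance injected by the noise. A minimal sketch of noisy gradient descent with a decaying schedule is shown below; the quadratic objective, the $\eta_t = \eta_0/\sqrt{t}$ schedule, and all parameter values are illustrative assumptions, not a schedule from the cited work.

```python
import numpy as np

def noisy_gd(grad, x0, steps=500, eta0=0.5, noise_std=0.1, seed=0):
    """Gradient descent with additive Gaussian gradient noise and a
    decaying step-size schedule eta_t = eta0 / sqrt(t).
    Illustrative sketch only; schedule and noise model are assumptions."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for t in range(1, steps + 1):
        g = grad(x) + noise_std * rng.standard_normal(x.shape)
        x = x - (eta0 / np.sqrt(t)) * g        # shrinking steps damp the noise
    return x

# Minimize f(x) = ||x||^2 / 2, whose gradient is x; minimizer is 0.
x_final = noisy_gd(lambda x: x, x0=[5.0, -3.0])
```

With a constant step size the iterates would hover in a noise ball around the minimizer; the decaying schedule shrinks that ball over time at the cost of slower progress early on.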
Facing the upcoming era of Internet-of-Things and connected intelligence, efficient information processing, computation, and communication design becomes a key challenge …
The development of AI applications, especially in large-scale wireless networks, is growing exponentially, alongside the size and complexity of the architectures used. Particularly …
We propose Adaptive Compressed Gradient Descent (AdaCGD), a novel optimization algorithm for communication-efficient training of supervised machine learning models with …
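Compressed gradient methods like the one named above reduce communication by transmitting only a compressed version of each gradient. As a concrete (and deliberately generic) example of such a compression operator, the sketch below implements top-k sparsification; it is a common building block in this literature, not the AdaCGD algorithm itself.

```python
import numpy as np

def topk_compress(g, k):
    """Keep only the k largest-magnitude entries of gradient g and zero
    the rest, so only k (index, value) pairs need to be communicated.
    Generic sparsification sketch; not the cited paper's compressor."""
    g = np.asarray(g, dtype=float)
    idx = np.argsort(np.abs(g))[-k:]       # indices of the top-k entries
    out = np.zeros_like(g)
    out[idx] = g[idx]
    return out

g = np.array([0.1, -2.0, 0.3, 1.5, -0.05])
c = topk_compress(g, 2)                    # keeps -2.0 and 1.5, zeros the rest
```

In practice such biased compressors are usually paired with an error-feedback term that accumulates the discarded coordinates, so the omitted mass is not lost across rounds.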