Wireless channel adaptive DNN split inference for resource-constrained edge devices

J Lee, H Lee, W Choi - IEEE Communications Letters, 2023 - ieeexplore.ieee.org
Split inference facilitates deep neural network (DNN) inference tasks at resource-
constrained edge devices. However, a pre-determined split configuration of a DNN limits the …

Dynamic split computing for efficient deep edge intelligence

A Bakhtiarnia, N Milošević, Q Zhang… - ICASSP 2023-2023 …, 2023 - ieeexplore.ieee.org
Deploying deep neural networks (DNNs) on IoT and mobile devices is a challenging task
due to their limited computational resources. Thus, demanding tasks are often entirely …

Ensuring Bidirectional Privacy on Wireless Split Inference Systems

CC Sa, LC Cheng, HH Chung, TC Chiu… - IEEE Wireless …, 2024 - ieeexplore.ieee.org
With the advances of machine learning, edge computing, and wireless communications, split
inference has attracted more and more attention as a versatile inference paradigm. Split …

Improving device-edge cooperative inference of deep learning via 2-step pruning

W Shi, Y Hou, S Zhou, Z Niu, Y Zhang… - IEEE INFOCOM 2019 …, 2019 - ieeexplore.ieee.org
Deep neural networks (DNNs) are state-of-the-art solutions for many machine learning
applications, and have been widely used on mobile devices. Running DNNs on …

Packet-loss-tolerant split inference for delay-sensitive deep learning in lossy wireless networks

S Itahara, T Nishio, K Yamamoto - 2021 IEEE Global …, 2021 - ieeexplore.ieee.org
The distributed inference framework is an emerging technology for real-time applications
empowered by cutting-edge deep machine learning (ML) on resource-constrained Internet …

Optimal model placement and online model splitting for device-edge co-inference

J Yan, S Bi, YJA Zhang - IEEE Transactions on Wireless …, 2022 - ieeexplore.ieee.org
Device-edge co-inference opens up new possibilities for resource-constrained wireless
devices (WDs) to execute deep neural network (DNN)-based applications with heavy …

Split computing: DNN inference partition with load balancing in IoT-edge platform for beyond 5G

J Karjee, P Naik, K Anand, VN Bhargav - Measurement: Sensors, 2022 - Elsevier
In the era of beyond 5G technology, it is expected that more and more applications can use
deep neural network (DNN) models for different purposes with minimum inference time …

Progressive feature transmission for split inference at the wireless edge

Q Lan, Q Zeng, P Popovski, D Gündüz… - arXiv preprint arXiv …, 2021 - arxiv.org
In edge inference, an edge server provides remote-inference services to edge devices. This
requires the edge devices to upload high-dimensional features of data samples over …

Dynamic encoding and decoding of information for split learning in mobile-edge computing: Leveraging information bottleneck theory

O Alhussein, M Wei, A Akhavain - GLOBECOM 2023-2023 …, 2023 - ieeexplore.ieee.org
Split learning is a privacy-preserving distributed learning paradigm in which an ML model
(e.g., a neural network) is split into two parts (i.e., an encoder and a decoder). The encoder …

Energy-efficient cooperative inference via adaptive deep neural network splitting at the edge

I Labriji, M Merluzzi, FE Airod… - ICC 2023-IEEE …, 2023 - ieeexplore.ieee.org
Learning and inference at the edge is all about distilling, exchanging, and processing data
in a cooperative and distributed way, to achieve challenging trade-offs involving energy …