Edge intelligence for smart grid: a survey on application potentials

HB Gooi, T Wang, Y Tang - CSEE Journal of Power and Energy …, 2023 - ieeexplore.ieee.org
With the boom of artificial intelligence (AI), the Internet of Things (IoT), and high-speed
communication technology, integrating these technologies to innovate the smart grid (SG) …

DDPQN: An efficient DNN offloading strategy in local-edge-cloud collaborative environments

M Xue, H Wu, G Peng, K Wolter - IEEE Transactions on …, 2021 - ieeexplore.ieee.org
With the rapid development of the Internet of Things (IoT) and communication technology,
Deep Neural Network (DNN) applications, such as computer vision, can now be widely used in …

Early-Exit Deep Neural Network - A Comprehensive Survey

H Rahmath P, V Srivastava, K Chaurasia… - ACM Computing …, 2024 - dl.acm.org
Deep neural networks (DNNs) typically have a single exit point that makes predictions by
running the entire stack of neural layers. Since not all inputs require the same amount of …
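
The early-exit idea this survey covers can be illustrated with a minimal sketch. The PyTorch snippet below is not taken from the surveyed work; the class name, layer sizes, and the 0.9 confidence threshold are assumptions made for illustration. It shows a small CNN with one auxiliary classifier, so that inputs whose early-exit confidence clears the threshold skip the deeper layers at inference time.

```python
# Minimal early-exit DNN sketch (illustrative only): a shallow backbone with an
# auxiliary exit; confident inputs stop early, the rest run the full network.
import torch
import torch.nn as nn
import torch.nn.functional as F

class EarlyExitCNN(nn.Module):
    def __init__(self, num_classes=10, exit_threshold=0.9):
        super().__init__()
        self.exit_threshold = exit_threshold
        # Shallow stage shared by both exits.
        self.stage1 = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2))
        # Early (auxiliary) classifier attached after stage1.
        self.early_exit = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, num_classes))
        # Deeper stage and final classifier.
        self.stage2 = nn.Sequential(
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2))
        self.final_exit = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, num_classes))

    def forward(self, x):
        h = self.stage1(x)
        early_logits = self.early_exit(h)
        # During training, return both heads so both can be supervised.
        if self.training:
            return early_logits, self.final_exit(self.stage2(h))
        # At inference (single-image batch for simplicity), stop early if confident.
        if F.softmax(early_logits, dim=1).max().item() >= self.exit_threshold:
            return early_logits
        return self.final_exit(self.stage2(h))

model = EarlyExitCNN().eval()
with torch.no_grad():
    print(model(torch.randn(1, 3, 32, 32)).shape)  # torch.Size([1, 10])
```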

A survey on deep learning for challenged networks: Applications and trends

K Bochie, MS Gilbert, L Gantert, MSM Barbosa… - Journal of Network and …, 2021 - Elsevier
Computer networks are dealing with growing complexity, given the ever-increasing volume
of data produced by all sorts of network nodes. Performance improvements are a non-stop …

An open source framework based on Kafka-ML for Distributed DNN inference over the Cloud-to-Things continuum

DR Torres, C Martin, B Rubio, M Diaz - Journal of Systems Architecture, 2021 - Elsevier
The current dependency of Artificial Intelligence (AI) systems on Cloud computing implies
higher transmission latency and bandwidth consumption. Moreover, it challenges the real …

On the impact of deep neural network calibration on adaptive edge offloading for image classification

RG Pacheco, RS Couto, O Simeone - Journal of Network and Computer …, 2023 - Elsevier
Edge devices can offload deep neural network (DNN) inference to the cloud to overcome
energy or processing constraints. Nevertheless, offloading adds communication delay …
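
The adaptive offloading setting studied here can be sketched as a confidence-thresholded decision. The snippet below is a minimal illustration under assumed names (calibrated_confidence, infer_with_offloading, the remote_infer stub, and the threshold/temperature values are all invented for this example), not the paper's implementation; it uses temperature scaling, a standard calibration technique, to decide whether to answer locally or offload.

```python
# Illustrative confidence-thresholded offloading with temperature scaling
# (names and the remote_infer stub are assumptions, not the cited paper's code).
import torch
import torch.nn.functional as F

def calibrated_confidence(logits: torch.Tensor, temperature: float) -> torch.Tensor:
    """Max softmax probability after temperature scaling (a common calibration step)."""
    return F.softmax(logits / temperature, dim=1).max(dim=1).values

def infer_with_offloading(x, edge_model, remote_infer, threshold=0.8, temperature=1.5):
    """Run the edge model; offload the input only if calibrated confidence is low."""
    with torch.no_grad():
        logits = edge_model(x)
    if calibrated_confidence(logits, temperature).item() >= threshold:
        return logits.argmax(dim=1), "edge"   # confident: answer locally
    return remote_infer(x), "cloud"           # uncertain: pay the offloading delay

# Toy usage with stand-in models.
edge_model = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(3 * 32 * 32, 10)).eval()
remote_infer = lambda x: torch.zeros(x.shape[0], dtype=torch.long)  # placeholder "cloud" call
pred, where = infer_with_offloading(torch.randn(1, 3, 32, 32), edge_model, remote_infer)
print(pred, where)
```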

OfpCNN: On-Demand Fine-Grained Partitioning for CNN Inference Acceleration in Heterogeneous Devices

L Yang, C Zheng, X Shen, G Xie - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Collaborative inference is a promising method for balancing the limited computational power
of Internet of Things (IoT) devices with the huge computational demands of convolutional …
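
Collaborative inference by model partitioning can be sketched as splitting a network at a layer boundary. The example below is an assumption-laden illustration (the toy model and the fixed split_point are invented for this sketch, not OfpCNN's partitioning algorithm): the device runs the head of the model, and the intermediate activation is what would cross the device-to-server link before the remaining layers run remotely.

```python
# Minimal layer-wise partitioning sketch (illustrative only): split a sequential
# model at a chosen index; the device runs the head, the server runs the tail.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(), nn.Linear(32 * 8 * 8, 10),
).eval()

split_point = 3                      # in practice chosen from per-layer latency/bandwidth profiles
device_part = model[:split_point]    # runs on the IoT device
server_part = model[split_point:]    # runs on the edge server / cloud

x = torch.randn(1, 3, 32, 32)
with torch.no_grad():
    activation = device_part(x)      # this tensor is what would be transmitted
    logits = server_part(activation)
print(activation.shape, logits.shape)
```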

Calibration-aided edge inference offloading via adaptive model partitioning of deep neural networks

RG Pacheco, RS Couto… - ICC 2021-IEEE …, 2021 - ieeexplore.ieee.org
Mobile devices can offload deep neural network (DNN)-based inference to the cloud,
overcoming local hardware and energy limitations. However, offloading adds …

Managing and deploying distributed and deep neural models through Kafka-ML in the cloud-to-things continuum

A Carnero, C Martín, DR Torres, D Garrido… - IEEE …, 2021 - ieeexplore.ieee.org
The Internet of Things (IoT) is constantly growing, generating an uninterrupted data stream
pipeline to monitor physical world information. Hence, Artificial Intelligence (AI) continuously …

ClassyNet: Class-Aware Early Exit Neural Networks for Edge Devices

M Ayyat, T Nadeem, B Krawczyk - IEEE Internet of Things …, 2023 - ieeexplore.ieee.org
Edge-based and IoT devices have seen phenomenal growth in recent years, driven by the
surge in demand for emerging applications that leverage machine learning models, such as …