Partitioning DNNs for optimizing distributed inference performance on cooperative edge devices: A genetic algorithm approach

J Na, H Zhang, J Lian, B Zhang - Applied Sciences, 2022 - mdpi.com
… in the optimal distributed inferencing performance, has not … DNN partitioning approach to
obtain an optimal distributed … the partitioning problem as a constrained optimization problem …

CoEdge: Cooperative DNN inference with adaptive workload partitioning over heterogeneous edge devices

L Zeng, X Chen, Z Zhou, L Yang… - IEEE/ACM Transactions …, 2020 - ieeexplore.ieee.org
cooperative inference over heterogeneous devices to … partitioning on cooperative inference
workflow, and build a constrained programming model on workload distribution optimization. …

Joint optimization with DNN partitioning and resource allocation in mobile edge computing

C Dong, S Hu, X Chen, W Wen - IEEE Transactions on Network …, 2021 - ieeexplore.ieee.org
… computation between IoT devices and edge servers. Our … DNN-based inference acceleration
for multiple IoT devices in … platform with one edge cloud and multiple IoT devices, just …

Cooperative distributed deep neural network deployment with edge computing

CY Yang, JJ Kuo, JP Sheu… - ICC 2021-IEEE …, 2021 - ieeexplore.ieee.org
edge computing system CoopAI to distribute DNN inference over several edge devices with
a novel model partition … Subsequently, we present a new optimization problem to minimize …

Throughput maximization of delay-aware DNN inference in edge computing by exploring DNN model partitioning and inference parallelism

J Li, W Liang, Y Li, Z Xu, X Jia… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org
… the cooperation of multiple edge devices with heterogeneous computing capacities for DNN
inference, and studied the dynamic DNN … and a cloud to optimize the experienced inference

DNN surgery: Accelerating DNN inference on the edge through layer partitioning

H Liang, Q Sang, C Hu, D Cheng… - IEEE transactions on …, 2023 - ieeexplore.ieee.org
… the Partitioning optimization of the Edge-Cloud DNN inference … between the edge device
and the cloud at the granularity of … Zhang, “CoEdge: Cooperative DNN inference with adaptive …

Collaborative Intelligence: Accelerating Deep Neural Network Inference via Device-Edge Synergy

N Shan, Z Ye, X Cui - Security and Communication Networks, 2020 - Wiley Online Library
inference through device-edge synergy. We use Cogent automated pruning and partition to
jointly optimize DNN model inference … task cooperatively. It not only makes full use of the rich …

Distributed DNN inference with fine-grained model partitioning in mobile edge computing networks

H Li, X Li, Q Fan, Q He, X Wang… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
… at network edges and orchestrate cooperative DNN inference over … blocks across multiple
computing resources of IoT devices … total inference delay of DNN tasks by optimizing the model …

Towards real-time cooperative deep inference over the cloud and edge end devices

S Zhang, Y Li, X Liu, S Guo, W Wang, J Wang… - Proceedings of the …, 2020 - dl.acm.org
… in cooperative deep inference, we partition the DNN model into two parts and execute different
parts on different devices (cloud or end devices) … partition 3 models with optimized inference …

Inference acceleration with adaptive distributed DNN partition over dynamic video stream

J Cao, B Li, M Fan, H Liu - Algorithms, 2022 - mdpi.com
… Instead of optimizing only the single DNN inference process, … , we share the edge’s
computation task with the end device’s … The devices in an interlayer cooperation must follow the …