DeePar: A hybrid device-edge-cloud execution framework for mobile deep learning applications

Y Huang, F Wang, F Wang, J Liu - IEEE INFOCOM 2019-IEEE …, 2019 - ieeexplore.ieee.org
With the deep penetration of mobile devices, more and more mobile deep learning
applications are being used in daily life. However, since deep learning tasks are …

Cognitive service in mobile edge computing

C Ding, A Zhou, X Ma, S Wang - 2020 IEEE International …, 2020 - ieeexplore.ieee.org
Cognitive services have revolutionized the way we live, work and interact with the world. In
recent years, deep neural networks have become the mainstream approach in cognitive …

Deep neural network task partitioning and offloading for mobile edge computing

M Gao, W Cui, D Gao, R Shen, J Li… - 2019 IEEE Global …, 2019 - ieeexplore.ieee.org
Surging Deep Neural Network (DNN)-based applications are becoming increasingly
popular in mobile computing. However, they pose significant challenges for mobile …

Edge AI: On-demand accelerating deep neural network inference via edge computing

E Li, L Zeng, Z Zhou, X Chen - IEEE Transactions on Wireless …, 2019 - ieeexplore.ieee.org
As a key technology enabling Artificial Intelligence (AI) applications in the 5G era, Deep
Neural Networks (DNNs) have quickly attracted widespread attention. However, it is …

Delay-aware DNN inference throughput maximization in edge computing via jointly exploring partitioning and parallelism

J Li, W Liang, Y Li, Z Xu, X Jia - 2021 IEEE 46th Conference on …, 2021 - ieeexplore.ieee.org
Mobile Edge Computing (MEC) has emerged as a promising paradigm catering to
the explosive growth of mobile applications by offloading the compute-intensive tasks …

Edge intelligence: On-demand deep learning model co-inference with device-edge synergy

E Li, Z Zhou, X Chen - Proceedings of the 2018 workshop on mobile …, 2018 - dl.acm.org
As the backbone technology of machine learning, deep neural networks (DNNs) have
quickly ascended to the spotlight. Running DNNs on resource-constrained mobile devices …

An adaptive DNN inference acceleration framework with end–edge–cloud collaborative computing

G Liu, F Dai, X Xu, X Fu, W Dou, N Kumar… - Future Generation …, 2023 - Elsevier
Intelligent applications based on Deep Neural Networks (DNNs) have been
intensively deployed on mobile devices. Unfortunately, resource-constrained mobile devices …

Autodidactic neurosurgeon: Collaborative deep inference for mobile edge intelligence via online learning

L Zhang, L Chen, J Xu - Proceedings of the Web Conference 2021, 2021 - dl.acm.org
Recent breakthroughs in deep learning (DL) have led to the emergence of many intelligent
mobile applications and services, but at the same time also pose unprecedented computing …

Throughput maximization of delay-aware DNN inference in edge computing by exploring DNN model partitioning and inference parallelism

J Li, W Liang, Y Li, Z Xu, X Jia… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org
Mobile Edge Computing (MEC) has emerged as a promising paradigm catering to
the explosive growth of mobile applications by offloading compute-intensive tasks to …

Computation offloading scheduling for deep neural network inference in mobile computing

Y Duan, J Wu - 2021 IEEE/ACM 29th International Symposium …, 2021 - ieeexplore.ieee.org
The quality of service (QoS) of intelligent applications on mobile devices heavily depends on
the inference speed of Deep Neural Network (DNN) models. Cooperative DNN inference …