Authors
Min Xue, Huaming Wu, Ruidong Li, Minxian Xu, Pengfei Jiao
Publication date
2021/9/10
Journal
IEEE Transactions on Green Communications and Networking
Volume
6
Issue
1
Pages
248-264
Publisher
IEEE
Description
With the popularity of mobile devices, intelligent applications, e.g., face recognition, intelligent voice assistants, and gesture recognition, have become widely used in daily life. However, due to their limited computing capacity, it is difficult for mobile devices to support complex Deep Neural Network (DNN) inference. To relieve the pressure on these devices, traditional methods usually upload part of the DNN model to a cloud server and perform a DNN query only after the entire DNN model has been uploaded. To achieve real-time DNN queries, we consider collaboration among the local device, the edge, and the cloud, and perform DNN queries while DNN partitions are being uploaded. In this paper, we propose an Efficient offloading scheme for DNN Inference Acceleration (EosDNN) in a local-edge-cloud collaborative environment, where the DNN inference acceleration is mainly embodied in the optimization of migration delay and realization of real …