Cloud-based or on-device: An empirical study of mobile deep inference

T Guo - 2018 IEEE International Conference on Cloud …, 2018 - ieeexplore.ieee.org
Modern mobile applications are benefiting significantly from the advancement in deep
learning, e.g., implementing real-time image recognition and conversational systems. Given a …

MDInference: Balancing inference accuracy and latency for mobile applications

SS Ogden, T Guo - 2020 IEEE International Conference on …, 2020 - ieeexplore.ieee.org
Deep Neural Networks are allowing mobile devices to incorporate a wide range of features
into user applications. However, the computational complexity of these models makes it …

Characterizing the deep neural networks inference performance of mobile applications

SS Ogden, T Guo - arXiv preprint arXiv:1909.04783, 2019 - arxiv.org
Today's mobile applications are increasingly leveraging deep neural networks to provide
novel features, such as image and speech recognition. To use a pre-trained deep neural …

DeepX: A software accelerator for low-power deep learning inference on mobile devices

ND Lane, S Bhattacharya, P Georgiev… - 2016 15th ACM/IEEE …, 2016 - ieeexplore.ieee.org
Breakthroughs from the field of deep learning are radically changing how sensor data are
interpreted to extract the high-level information needed by mobile apps. It is critical that the …

MODI: Mobile deep inference made efficient by edge computing

SS Ogden, T Guo - USENIX Workshop on Hot Topics in Edge Computing …, 2018 - usenix.org
In this paper, we propose a novel mobile deep inference platform, MODI, that delivers good
inference performance. MODI improves deep learning powered mobile applications …

Deep learning towards mobile applications

J Wang, B Cao, P Yu, L Sun, W Bao… - 2018 IEEE 38th …, 2018 - ieeexplore.ieee.org
Recent years have witnessed an explosive growth of mobile devices. Mobile devices are
permeating every aspect of our daily lives. With the increasing usage of mobile devices and …

AsyMo: Scalable and efficient deep-learning inference on asymmetric mobile CPUs

M Wang, S Ding, T Cao, Y Liu, F Xu - Proceedings of the 27th Annual …, 2021 - dl.acm.org
On-device deep learning (DL) inference has attracted vast interest. Mobile CPUs are the
most common hardware for on-device inference and many inference frameworks have been …

On-device neural net inference with mobile GPUs

J Lee, N Chirkov, E Ignasheva, Y Pisarchyk… - arXiv preprint arXiv …, 2019 - arxiv.org
On-device inference of machine learning models for mobile phones is desirable due to its
lower latency and increased privacy. Running such a compute-intensive task solely on the …

Integration of convolutional neural networks in mobile applications

RC Castanyer, S Martínez-Fernández… - 2021 IEEE/ACM 1st …, 2021 - ieeexplore.ieee.org
When building Deep Learning (DL) models, data scientists and software engineers manage
the trade-off between their accuracy, or any other suitable success criteria, and their …

Guidelines and benchmarks for deployment of deep learning models on smartphones as real-time apps

A Sehgal, N Kehtarnavaz - Machine Learning and Knowledge Extraction, 2019 - mdpi.com
Deep learning solutions are being increasingly used in mobile applications. Although there
are many open-source software tools for the development of deep learning solutions, there …