Optimising resource management for embedded machine learning

L Xun, L Tran-Thanh, BM Al-Hashimi… - … Design, Automation & …, 2020 - ieeexplore.ieee.org
Machine learning inference is increasingly being executed locally on mobile and embedded
platforms, due to the clear advantages in latency, privacy and connectivity. In this paper, we …

Other versions of the same paper (identical abstract), with the full author list as given there:

L Xun, L Tran-Thanh, BM Al-Hashimi, GV Merrett
- Proceedings of the 23rd …, 2020 - dl.acm.org
- arXiv preprint arXiv …, 2021 - arxiv.org
- arXiv e…, 2021 - ui.adsabs.harvard.edu
- [PDF] scholar.archive.org
- [PDF] past.date-conference.com
- 2020 - eprints.soton.ac.uk