A survey of methods for low-power deep learning and computer vision

A Goel, C Tung, YH Lu… - 2020 IEEE 6th World …, 2020 - ieeexplore.ieee.org
Deep neural networks (DNNs) are successful in many computer vision tasks. However, the
most accurate DNNs require millions of parameters and operations, making them energy …

Quantization and deployment of deep neural networks on microcontrollers

PE Novac, G Boukli Hacene, A Pegatoquet… - Sensors, 2021 - mdpi.com
Embedding Artificial Intelligence onto low-power devices is a challenging task that has been
partly overcome with recent advances in machine learning and hardware design. Presently …
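Deploying networks on microcontrollers typically starts with integer quantization of the weights. As a minimal illustrative sketch (not the paper's specific pipeline), symmetric per-tensor int8 quantization maps the largest weight magnitude to 127 and stores one float scale per tensor:

```python
import numpy as np

def quantize_int8(w):
    """Symmetric per-tensor quantization of float weights to int8."""
    scale = np.max(np.abs(w)) / 127.0  # largest magnitude maps to +/-127
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the int8 values."""
    return q.astype(np.float32) * scale

w = np.array([0.5, -1.27, 0.0, 1.0], dtype=np.float32)
q, s = quantize_int8(w)
w_hat = dequantize(q, s)
print(np.max(np.abs(w - w_hat)))  # reconstruction error bounded by scale/2
```

Real deployments (e.g. on Cortex-M class MCUs) additionally quantize activations and fold the scales into fixed-point multiplies; this sketch shows only the weight-side idea.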

Autopruner: An end-to-end trainable filter pruning method for efficient deep model inference

JH Luo, J Wu - Pattern Recognition, 2020 - Elsevier
Channel pruning is an important method to speed up a CNN model's inference. Previous filter
pruning algorithms regard importance evaluation and model fine-tuning as two independent …
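AutoPruner itself learns which filters to drop during training; as a simpler baseline for contrast, the classic two-stage approach the abstract alludes to ranks filters by a hand-picked importance score (here the L1 norm) and discards the weakest. A hedged sketch of that baseline:

```python
import numpy as np

def l1_filter_scores(conv_weight):
    """Score each output filter of a conv layer by its L1 norm.
    conv_weight has shape (out_channels, in_channels, kH, kW)."""
    return np.abs(conv_weight).reshape(conv_weight.shape[0], -1).sum(axis=1)

def prune_filters(conv_weight, keep_ratio=0.5):
    """Keep the highest-scoring filters; return pruned weights and kept indices."""
    scores = l1_filter_scores(conv_weight)
    n_keep = max(1, int(round(keep_ratio * len(scores))))
    keep = np.sort(np.argsort(scores)[-n_keep:])  # indices of top filters, in order
    return conv_weight[keep], keep

w = np.random.randn(8, 3, 3, 3)
pruned, kept = prune_filters(w, keep_ratio=0.5)
print(pruned.shape)  # (4, 3, 3, 3)
```

After pruning, the corresponding input channels of the next layer must be removed as well, and the network is fine-tuned; the end-to-end methods cited here fold those steps into a single training run.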

Graph neural networks: Architectures, stability, and transferability

L Ruiz, F Gama, A Ribeiro - Proceedings of the IEEE, 2021 - ieeexplore.ieee.org
Graph neural networks (GNNs) are information processing architectures for signals
supported on graphs. They are presented here as generalizations of convolutional neural …
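The "generalization of convolution" view can be made concrete: a graph filter applies a polynomial in the graph shift operator S to a signal x, and on a cycle graph this reduces to ordinary circular convolution. A minimal sketch of that idea (illustrative, not the paper's notation verbatim):

```python
import numpy as np

def graph_convolution(S, x, h):
    """Apply a graph filter y = sum_k h[k] * S^k x, where S is the
    graph shift operator (e.g. adjacency matrix) and h the filter taps."""
    y = np.zeros_like(x)
    Skx = x.copy()  # holds S^k x, starting at k = 0
    for hk in h:
        y = y + hk * Skx
        Skx = S @ Skx
    return y

# Directed 4-node cycle: here the graph filter is a circular convolution.
S = np.roll(np.eye(4), 1, axis=1)
x = np.array([1.0, 0.0, 0.0, 0.0])
print(graph_convolution(S, x, [0.5, 0.25]))
```

A GNN layer composes such filters with a pointwise nonlinearity; the stability and transferability results surveyed here concern how these filters behave under perturbations and changes of graph size.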

Rubiksnet: Learnable 3d-shift for efficient video action recognition

L Fan, S Buch, G Wang, R Cao, Y Zhu… - … on Computer Vision, 2020 - Springer
Video action recognition is a complex task dependent on modeling spatial and temporal
context. Standard approaches rely on 2D or 3D convolutions to process such context …
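Shift-based video models replace costly 3D convolutions with channel shifts along the temporal (and, in RubiksNet, spatial) axes. The sketch below shows a fixed temporal shift in the style of earlier shift modules; RubiksNet's contribution is making such shifts learnable, which this simplified version does not attempt:

```python
import numpy as np

def temporal_shift(x, fold_div=4):
    """Shift a fraction of channels along the time axis at zero FLOP cost.
    x has shape (T, C, H, W); 1/fold_div of channels shift each way."""
    T, C, H, W = x.shape
    fold = C // fold_div
    out = np.zeros_like(x)
    out[:-1, :fold] = x[1:, :fold]                   # first fold: shift backward in time
    out[1:, fold:2 * fold] = x[:-1, fold:2 * fold]   # second fold: shift forward in time
    out[:, 2 * fold:] = x[:, 2 * fold:]              # remaining channels unchanged
    return out
```

Interleaving such shifts with 2D convolutions lets a network mix temporal context without any 3D kernels, which is the efficiency argument these works build on.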

Deep geometric knowledge distillation with graphs

C Lassance, M Bontonou, GB Hacene… - ICASSP 2020-2020 …, 2020 - ieeexplore.ieee.org
In most cases, deep learning architectures are trained without regard for the number of operations
and the energy consumption. However, some applications, like embedded systems, can be …
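Graph-based distillation transfers the geometry of a teacher's intermediate representations rather than its logits. As a rough, hypothetical sketch of that family of losses (the paper's exact formulation may differ), one can build a similarity graph over a batch of features for both networks and penalize the mismatch:

```python
import numpy as np

def similarity_graph(feats):
    """Cosine-similarity adjacency matrix over a batch of feature vectors."""
    f = feats / np.linalg.norm(feats, axis=1, keepdims=True)
    return f @ f.T

def geometric_distillation_loss(student_feats, teacher_feats):
    """Mean squared difference between student and teacher similarity
    graphs: the student is pushed to preserve the teacher's geometry."""
    Gs = similarity_graph(student_feats)
    Gt = similarity_graph(teacher_feats)
    return float(np.mean((Gs - Gt) ** 2))
```

Because the loss compares pairwise relations instead of raw activations, the student's feature dimension need not match the teacher's, which suits compressing large models into small embedded-friendly ones.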

Rethinking weight decay for efficient neural network pruning

H Tessier, V Gripon, M Léonardon, M Arzel… - Journal of …, 2022 - mdpi.com
Introduced in the late 1980s for generalization purposes, pruning has now become a staple
for compressing deep neural networks. Despite many innovations in recent decades …
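The link between weight decay and pruning is that decay continually shrinks weights that gradients do not sustain, so a magnitude threshold afterwards removes exactly those weights. A minimal illustrative sketch (the learning rate, decay, and threshold values here are arbitrary placeholders, not the paper's settings):

```python
import numpy as np

def decay_and_prune(w, grad, lr=0.1, weight_decay=1e-2, threshold=1e-3):
    """One SGD step with weight decay, then magnitude pruning:
    decay drives unsupported weights toward zero; the threshold removes them."""
    w = w - lr * (grad + weight_decay * w)   # decoupled-style decay term
    mask = np.abs(w) >= threshold            # boolean mask of surviving weights
    return w * mask, mask
```

Iterating this step over training gradually sparsifies the network; the cited work revisits how the decay schedule itself should be chosen when pruning is the goal rather than generalization.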

Complexity-Driven Model Compression for Resource-constrained Deep Learning on Edge

M Zawish, S Davy, L Abraham - IEEE Transactions on Artificial …, 2024 - ieeexplore.ieee.org
Recent advances in Artificial Intelligence (AI) on the Internet of Things (IoT) devices have
realized Edge AI in several applications by enabling low latency and energy efficiency …

Complexity-driven CNN compression for resource-constrained edge AI

M Zawish, S Davy, L Abraham - arXiv preprint arXiv:2208.12816, 2022 - arxiv.org
Recent advances in Artificial Intelligence (AI) on the Internet of Things (IoT)-enabled network
edge have realized edge intelligence in several applications such as smart agriculture, smart …

Towards a configurable and non-hierarchical search space for NAS

M Perrin, W Guicquero, B Paille, G Sicard - Neural Networks, 2024 - Elsevier
Neural Architecture Search (NAS) outperforms handcrafted Neural Network (NN)
design. However, current NAS methods generally use hard-coded search spaces, and …