Explainable deep learning for efficient and robust pattern recognition: A survey of recent developments

X Bai, X Wang, X Liu, Q Liu, J Song, N Sebe, B Kim - Pattern Recognition, 2021 - Elsevier
Deep learning has recently achieved great success in many visual recognition tasks.
However, deep neural networks (DNNs) are often perceived as black boxes, making …

A survey on efficient convolutional neural networks and hardware acceleration

D Ghimire, D Kil, S Kim - Electronics, 2022 - mdpi.com
Over the past decade, deep-learning-based representations have demonstrated remarkable
performance in academia and industry. The learning capability of convolutional neural …

Depgraph: Towards any structural pruning

G Fang, X Ma, M Song, MB Mi… - Proceedings of the …, 2023 - openaccess.thecvf.com
Structural pruning enables model acceleration by removing structurally grouped parameters
from neural networks. However, the parameter-grouping patterns vary widely across …
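To make the idea of "structurally grouped" removal concrete, here is a minimal sketch of channel-level pruning by L2 norm: dropping an output channel of one layer forces the matching input slice of the next layer to go with it. This is a generic illustration, not the DepGraph method; all names (`prune_channels`, `keep_ratio`) are illustrative.

```python
import math

def prune_channels(layer1, layer2, keep_ratio=0.5):
    """layer1: list of output-channel weight vectors.
    layer2: next layer's weight rows, one weight per layer1 channel.
    Keeps the highest-L2-norm channels of layer1 and removes the
    coupled columns of layer2 -- the structural grouping."""
    norms = [math.sqrt(sum(w * w for w in ch)) for ch in layer1]
    n_keep = max(1, int(len(layer1) * keep_ratio))
    # Indices of the n_keep largest-norm channels, in original order.
    keep = sorted(sorted(range(len(norms)), key=lambda i: -norms[i])[:n_keep])
    pruned1 = [layer1[i] for i in keep]
    # Removing a channel upstream removes the matching column downstream.
    pruned2 = [[row[i] for i in keep] for row in layer2]
    return pruned1, pruned2
```

Real structural-pruning tools must trace these dependencies through arbitrary graphs (skip connections, concatenations); this sketch shows only the simplest two-layer coupling.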

A survey of quantization methods for efficient neural network inference

A Gholami, S Kim, Z Dong, Z Yao… - Low-Power Computer …, 2022 - taylorfrancis.com
This chapter surveys approaches to quantizing the numerical values in deep
neural network computations, covering the advantages/disadvantages of current methods …

Sparsity in deep learning: Pruning and growth for efficient inference and training in neural networks

T Hoefler, D Alistarh, T Ben-Nun, N Dryden… - Journal of Machine …, 2021 - jmlr.org
The growing energy and performance costs of deep learning have driven the community to
reduce the size of neural networks by selectively pruning components. Similarly to their …

What is the state of neural network pruning?

D Blalock, JJ Gonzalez Ortiz… - … of machine learning …, 2020 - proceedings.mlsys.org
Neural network pruning---the task of reducing the size of a network by removing parameters---has
been the subject of a great deal of work in recent years. We provide a meta-analysis of …

A systematic review on overfitting control in shallow and deep neural networks

MM Bejani, M Ghatee - Artificial Intelligence Review, 2021 - Springer
Shallow neural networks process the features directly, while deep networks extract features
automatically along with the training. Both models suffer from overfitting or poor …

Pruning neural networks without any data by iteratively conserving synaptic flow

H Tanaka, D Kunin, DL Yamins… - Advances in neural …, 2020 - proceedings.neurips.cc
Pruning the parameters of deep neural networks has generated intense interest due to
potential savings in time, memory and energy both during training and at test time. Recent …

Hrank: Filter pruning using high-rank feature map

M Lin, R Ji, Y Wang, Y Zhang… - Proceedings of the …, 2020 - openaccess.thecvf.com
Neural network pruning offers a promising route to deploying deep neural
networks on resource-limited devices. However, existing methods are still challenged by the …

Ghostnet: More features from cheap operations

K Han, Y Wang, Q Tian, J Guo… - Proceedings of the …, 2020 - openaccess.thecvf.com
Deploying convolutional neural networks (CNNs) on embedded devices is difficult due to the
limited memory and computation resources. The redundancy in feature maps is an important …