A survey on efficient convolutional neural networks and hardware acceleration

D Ghimire, D Kil, S Kim - Electronics, 2022 - mdpi.com
Over the past decade, deep-learning-based representations have demonstrated remarkable
performance in academia and industry. The learning capability of convolutional neural …

Pruning deep neural networks for green energy-efficient models: A survey

J Tmamna, EB Ayed, R Fourati, M Gogate, T Arslan… - Cognitive …, 2024 - Springer
Over the past few years, larger and deeper neural network models, particularly convolutional
neural networks (CNNs), have consistently advanced state-of-the-art performance across …

Channel permutations for N:M sparsity

J Pool, C Yu - Advances in neural information processing …, 2021 - proceedings.neurips.cc
We introduce channel permutations as a method to maximize the accuracy of N:M sparse
networks. N:M sparsity requires N out of M consecutive elements to be zero and has been …
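
The N:M pattern is concrete enough to sketch. The snippet below is a minimal illustration, not the paper's code: it zeroes the N smallest-magnitude weights in every group of M consecutive elements (the magnitude criterion and the 2:4 defaults are assumptions).

```python
# Minimal sketch of N:M sparsity: in every group of M consecutive weights,
# N are forced to zero. Illustrative only; not the paper's implementation.
import torch

def apply_nm_sparsity(weight: torch.Tensor, n: int = 2, m: int = 4) -> torch.Tensor:
    flat = weight.reshape(-1, m)              # group M consecutive elements
    idx = flat.abs().argsort(dim=1)[:, :n]    # n smallest-|w| positions per group
    mask = torch.ones_like(flat)
    mask.scatter_(1, idx, 0.0)                # zero out the pruned positions
    return (flat * mask).reshape(weight.shape)

w = torch.randn(8, 16)           # last dimension divisible by m
w_sparse = apply_nm_sparsity(w)  # 2:4 pattern: two zeros in every group of four
```

The paper's channel permutations would reorder the weight matrix's columns before a step like this, so that low-magnitude weights land in the same groups and more of the large weights survive pruning.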

Fire together wire together: A dynamic pruning approach with self-supervised mask prediction

S Elkerdawy, M Elhoushi, H Zhang… - Proceedings of the …, 2022 - openaccess.thecvf.com
Dynamic model pruning is a recent direction that allows for the inference of a different sub-
network for each input sample during deployment. However, current dynamic methods rely …
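
As a rough illustration of per-input sub-networks, the sketch below gates a convolution's output channels with a mask predicted from each sample; the average-pool gating head and the straight-through estimator are generic assumptions, not the paper's self-supervised mask predictor.

```python
# Generic sketch of dynamic (input-dependent) channel pruning; the gating
# design is an assumption, not the method proposed in the paper.
import torch
import torch.nn as nn

class GatedConvBlock(nn.Module):
    def __init__(self, in_ch: int, out_ch: int):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, 3, padding=1)
        self.gate = nn.Sequential(               # per-sample channel scores
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(in_ch, out_ch),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        scores = self.gate(x)                     # (batch, out_ch)
        soft = scores.sigmoid()
        hard = (scores > 0).float()               # binary keep/drop decision
        mask = hard + soft - soft.detach()        # straight-through gradient
        return self.conv(x) * mask[:, :, None, None]

x = torch.randn(2, 16, 32, 32)
y = GatedConvBlock(16, 32)(x)  # each sample in the batch gets its own sub-network
```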

UPDP: A unified progressive depth pruner for CNN and vision transformer

J Liu, D Tang, Y Huang, L Zhang, X Zeng, D Li… - Proceedings of the …, 2024 - ojs.aaai.org
Traditional channel-wise pruning methods, which reduce the number of network channels,
struggle to effectively prune efficient CNN models with depth-wise convolutional layers and certain …
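
The coupling that makes channel pruning awkward here is easy to see in code. Below is a generic depth-wise separable block (an assumed illustration, not the paper's model or method): because a depth-wise convolution uses groups equal to its channel count, each of its filters is tied to exactly one channel of the preceding layer, so channels cannot be removed independently.

```python
# Assumed illustration of channel coupling around a depth-wise convolution;
# not code from the UPDP paper.
import torch
import torch.nn as nn

block = nn.Sequential(
    nn.Conv2d(32, 64, kernel_size=1),                        # pointwise: 32 -> 64
    nn.Conv2d(64, 64, kernel_size=3, padding=1, groups=64),  # depthwise: 1 filter per channel
    nn.Conv2d(64, 128, kernel_size=1),                       # pointwise: 64 -> 128
)
out = block(torch.randn(1, 32, 56, 56))  # sanity check: shape (1, 128, 56, 56)

# Removing output channel k of the first conv forces removing depth-wise
# filter k and input channel k of the last conv as well; the whole chain is
# coupled, which is one motivation for pruning depth (whole blocks) instead.
```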

Accelerating convolutional neural networks via a 2D entropy-based adaptive filter search method for image recognition

C Li, H Li, G Gao, Z Liu, P Liu - Applied Soft Computing, 2023 - Elsevier
The success of CNNs for various vision tasks has been accompanied by a significant
increase in required FLOPs and parameter quantities, which has impeded the deployment of …

SOKS: Automatic searching of the optimal kernel shapes for stripe-wise network pruning

G Liu, K Zhang, M Lv - IEEE Transactions on Neural Networks …, 2022 - ieeexplore.ieee.org
In spite of the remarkable performance, deep convolutional neural networks (CNNs) are
typically over-parameterized and computationally expensive. Network pruning has become …

DepthShrinker: a new compression paradigm towards boosting real-hardware efficiency of compact neural networks

Y Fu, H Yang, J Yuan, M Li, C Wan… - International …, 2022 - proceedings.mlr.press
Efficient deep neural network (DNN) models equipped with compact operators (e.g.,
depthwise convolutions) have shown great potential in reducing DNNs' theoretical …

AGIC: Approximate gradient inversion attack on federated learning

J Xu, C Hong, J Huang, LY Chen… - 2022 41st International …, 2022 - ieeexplore.ieee.org
Federated learning is a private-by-design distributed learning paradigm where clients train
local models on their own data before a central server aggregates their local updates to …
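
The training loop described here is commonly implemented as federated averaging (FedAvg); the sketch below is a minimal illustration with placeholder model, data, and hyperparameters, not the attack setting studied in the paper.

```python
# Minimal FedAvg sketch: clients train locally, the server averages weights.
# All names and hyperparameters here are placeholders.
import copy
import torch
import torch.nn as nn

def local_update(global_model, loader, lr=0.01, epochs=1):
    model = copy.deepcopy(global_model)           # client starts from global weights
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in loader:                       # private data never leaves the client
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()
    return model.state_dict()

def fed_avg_round(global_model, client_loaders):
    states = [local_update(global_model, dl) for dl in client_loaders]
    avg = {k: torch.stack([s[k].float() for s in states]).mean(0)
           for k in states[0]}                    # element-wise mean of client weights
    global_model.load_state_dict(avg)
    return global_model
```

Gradient inversion attacks such as the one in this paper target exactly the updates shared in this loop: an honest-but-curious server or eavesdropper tries to reconstruct a client's private training samples from the weight or gradient updates it receives.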

Class-aware pruning for efficient neural networks

M Jiang, J Wang, A Eldebiky, X Yin… - … , Automation & Test …, 2024 - ieeexplore.ieee.org
Deep neural networks (DNNs) have demonstrated remarkable success in various fields.
However, the large number of floating-point operations (FLOPs) in DNNs poses challenges …