Pruning deep neural networks for green energy-efficient models: A survey

J Tmamna, EB Ayed, R Fourati, M Gogate, T Arslan… - Cognitive …, 2024 - Springer
Over the past few years, larger and deeper neural network models, particularly convolutional
neural networks (CNNs), have consistently advanced state-of-the-art performance across …

Expediting large-scale vision transformer for dense prediction without fine-tuning

W Liang, Y Yuan, H Ding, X Luo… - Advances in …, 2022 - proceedings.neurips.cc
Vision transformers have recently achieved competitive results across various vision tasks
but still suffer from heavy computation costs when processing a large number of tokens …
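As a hedged aside on why token count drives the cost: self-attention scales quadratically with the number of tokens, so reducing tokens yields large savings. The sketch below keeps only the top-k tokens by an importance score; this is a generic illustration of token reduction, not necessarily the mechanism of the cited paper, and the shapes, scores, and k are illustrative assumptions.

```python
import numpy as np

def keep_top_k_tokens(tokens: np.ndarray, scores: np.ndarray, k: int):
    """Generic token-reduction sketch for a vision transformer layer.

    tokens: (num_tokens, dim) token embeddings.
    scores: (num_tokens,) importance per token (e.g. attention received); assumed given.
    k:      number of tokens to keep.
    Keeps the k highest-scoring tokens in their original order.
    """
    keep_idx = np.sort(np.argsort(scores)[::-1][:k])
    return tokens[keep_idx], keep_idx

# Toy usage: 196 patch tokens of dimension 64, keep a quarter of them.
tok = np.random.randn(196, 64)
imp = np.random.rand(196)
kept_tok, kept_idx = keep_top_k_tokens(tok, imp, k=49)
print(kept_tok.shape)  # (49, 64); attention cost drops roughly 16x (quadratic in token count)
```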

Accelerate CNNs from three dimensions: A comprehensive pruning framework

W Wang, M Chen, S Zhao, L Chen… - International …, 2021 - proceedings.mlr.press
Most neural network pruning methods, such as filter-level and layer-level pruning, prune
the network model along only one dimension (depth, width, or resolution) to meet a …
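As a hedged back-of-the-envelope illustration of why depth, width, and resolution interact when targeting a compute budget (a common rule of thumb, not the cited framework's actual cost model): the FLOPs of a plain CNN grow roughly linearly with depth and quadratically with width and input resolution.

```python
def relative_flops(depth_mult: float, width_mult: float, res_mult: float) -> float:
    """Rough relative cost of a plain CNN after scaling each dimension.

    Assumes FLOPs ~ depth * width^2 * resolution^2, a common rule of thumb;
    a real pruning framework may use a more detailed cost model.
    """
    return depth_mult * width_mult**2 * res_mult**2

# Pruning only width to 50% gives ~4x fewer FLOPs...
print(relative_flops(1.0, 0.5, 1.0))   # 0.25
# ...while milder cuts along all three dimensions reach a similar budget.
print(relative_flops(0.8, 0.7, 0.8))   # ~0.25
```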

Efficient layer compression without pruning

J Wu, D Zhu, L Fang, Y Deng… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Network pruning is one of the chief means for improving the computational efficiency of
Deep Neural Networks (DNNs). Pruning-based methods generally discard network kernels …

MoPE-CLIP: Structured pruning for efficient vision-language models with module-wise pruning error metric

H Lin, H Bai, Z Liu, L Hou, M Sun… - Proceedings of the …, 2024 - openaccess.thecvf.com
Vision-language pre-trained models have achieved impressive performance on various
downstream tasks. However, their large model sizes hinder their utilization on platforms with …

Distributed Machine Learning in Edge Computing: Challenges, Solutions and Future Directions

J Tu, L Yang, J Cao - ACM Computing Surveys, 2024 - dl.acm.org
Distributed machine learning at the edge is widely used in intelligent transportation, smart
home, industrial manufacturing, and underground pipe network monitoring to achieve low …

Soft person reidentification network pruning via blockwise adjacent filter decaying

X Wang, Z Zheng, Y He, F Yan, Z Zeng… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org
Deep learning has shown significant success in person reidentification (re-id) tasks.
However, most existing works focus on discriminative feature learning and impose complex …

One less reason for filter pruning: Gaining free adversarial robustness with structured grouped kernel pruning

SH Zhong, Z You, J Zhang, S Zhao… - Advances in neural …, 2023 - proceedings.neurips.cc
Densely structured pruning methods utilizing simple pruning heuristics can deliver
immediate compression and acceleration benefits with acceptable benign performance …

Revisit kernel pruning with lottery regulated grouped convolutions

S Zhong - 2022 - rave.ohiolink.edu
Structured pruning methods, which are capable of delivering a densely pruned network, are
among the most popular techniques in the realm of neural network pruning, where most …

COP: customized correlation-based filter-level pruning method for deep CNN compression

W Wang, Z Yu, C Fu, D Cai, X He - Neurocomputing, 2021 - Elsevier
As deep CNNs get larger, it becomes more challenging to deploy them on resource-
restricted mobile devices. Filter-level pruning is one of the most popular methods to …
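As a generic, hedged illustration of what filter-level pruning means in practice (not the correlation-based criterion of the cited COP paper), the sketch below scores each output filter of a convolution by its L1 norm and keeps the highest-scoring fraction; the shapes and keep ratio are illustrative assumptions.

```python
import numpy as np

def prune_filters_by_l1(conv_weight: np.ndarray, keep_ratio: float = 0.5):
    """Generic filter-level pruning sketch (not COP's correlation-based criterion).

    conv_weight: weights of one conv layer, shape (out_channels, in_channels, k, k).
    keep_ratio:  fraction of output filters to keep (assumed hyperparameter).
    Returns the pruned weight tensor and the indices of the kept filters.
    """
    # Importance of each output filter = L1 norm of its weights.
    importance = np.abs(conv_weight).reshape(conv_weight.shape[0], -1).sum(axis=1)
    n_keep = max(1, int(round(keep_ratio * conv_weight.shape[0])))
    keep_idx = np.sort(np.argsort(importance)[::-1][:n_keep])  # top-n_keep filters
    return conv_weight[keep_idx], keep_idx

# Toy usage: a conv layer with 8 output filters, 3 input channels, 3x3 kernels.
w = np.random.randn(8, 3, 3, 3)
pruned_w, kept = prune_filters_by_l1(w, keep_ratio=0.5)
print(pruned_w.shape, kept)  # (4, 3, 3, 3) and the 4 surviving filter indices
```

In a full pipeline, the corresponding input channels of the following layer would be removed to match, typically followed by fine-tuning to recover accuracy.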