A survey on deep neural network pruning: Taxonomy, comparison, analysis, and recommendations

H Cheng, M Zhang, JQ Shi - IEEE Transactions on Pattern …, 2024 - ieeexplore.ieee.org
Modern deep neural networks, particularly recent large language models, come with
massive model sizes that require significant computational and storage resources. To …

Recent advances on neural network pruning at initialization

H Wang, C Qin, Y Bai, Y Zhang, Y Fu - arXiv preprint arXiv:2103.06460, 2021 - arxiv.org
Neural network pruning typically removes connections or neurons from a pretrained,
converged model, while a new pruning paradigm, pruning at initialization (PaI), attempts to …

Sparsity in deep learning: Pruning and growth for efficient inference and training in neural networks

T Hoefler, D Alistarh, T Ben-Nun, N Dryden… - Journal of Machine …, 2021 - jmlr.org
The growing energy and performance costs of deep learning have driven the community to
reduce the size of neural networks by selectively pruning components. Similarly to their …

Can subnetwork structure be the key to out-of-distribution generalization?

D Zhang, K Ahuja, Y Xu, Y Wang… - … on Machine Learning, 2021 - proceedings.mlr.press
Can models with particular structure avoid being biased towards spurious correlations in out-
of-distribution (OOD) generalization? Peters et al. (2016) provide a positive answer for …

Advancing model pruning via bi-level optimization

Y Zhang, Y Yao, P Ram, P Zhao… - Advances in …, 2022 - proceedings.neurips.cc
The deployment constraints in practical applications necessitate the pruning of large-scale
deep learning models, i.e., promoting their weight sparsity. As illustrated by the Lottery Ticket …

Training your sparse neural network better with any mask

AK Jaiswal, H Ma, T Chen, Y Ding… - … on Machine Learning, 2022 - proceedings.mlr.press
Pruning large neural networks to create high-quality, independently trainable sparse masks,
which can maintain similar performance to their dense counterparts, is very desirable due to …

Winning the lottery ahead of time: Efficient early network pruning

J Rachwan, D Zügner, B Charpentier… - International …, 2022 - proceedings.mlr.press
Pruning, the task of sparsifying deep neural networks, has received increasing attention recently.
Although state-of-the-art pruning methods extract highly sparse models, they neglect two …

Rare gems: Finding lottery tickets at initialization

K Sreenivasan, J Sohn, L Yang… - Advances in neural …, 2022 - proceedings.neurips.cc
Large neural networks can be pruned to a small fraction of their original size, with little loss
in accuracy, by following a time-consuming "train, prune, re-train" approach. Frankle & …

Damage classification of in-service steel railway bridges using a novel vibration-based convolutional neural network

A Ghiasi, MK Moghaddam, CT Ng, AH Sheikh… - Engineering …, 2022 - Elsevier
Railway bridges exposed to extreme environmental conditions can gradually lose their
effective cross-section at critical locations, potentially causing catastrophic failure. This paper has …

Why random pruning is all we need to start sparse

AH Gadhikar, S Mukherjee… - … Conference on Machine …, 2023 - proceedings.mlr.press
Random masks define surprisingly effective sparse neural network models, as has been
shown empirically. The resulting sparse networks can often compete with dense …