Sparsity in deep learning: Pruning and growth for efficient inference and training in neural networks

T Hoefler, D Alistarh, T Ben-Nun, N Dryden… - Journal of Machine …, 2021 - jmlr.org
The growing energy and performance costs of deep learning have driven the community to
reduce the size of neural networks by selectively pruning components. Similarly to their …

NISP: Pruning networks using neuron importance score propagation

R Yu, A Li, CF Chen, JH Lai… - Proceedings of the …, 2018 - openaccess.thecvf.com
To reduce the significant redundancy in deep Convolutional Neural Networks (CNNs), most
existing methods prune neurons by only considering the statistics of an individual layer or …

Data-driven sparse structure selection for deep neural networks

Z Huang, N Wang - Proceedings of the European …, 2018 - openaccess.thecvf.com
Deep convolutional neural networks have demonstrated their extraordinary power on various tasks.
However, it is still very challenging to deploy state-of-the-art models into real-world …

Online deep learning: Learning deep neural networks on the fly

D Sahoo, Q Pham, J Lu, SCH Hoi - arXiv preprint arXiv:1711.03705, 2017 - arxiv.org
Deep Neural Networks (DNNs) are typically trained by backpropagation in a batch learning
setting, which requires the entire training data to be made available prior to the learning …

Training sparse neural networks

S Srinivas, A Subramanya… - Proceedings of the …, 2017 - openaccess.thecvf.com
The emergence of deep neural networks has enabled human-level performance on large-scale
computer vision tasks such as image classification. However, these deep networks typically …

PAC-Bayesian framework based drop-path method for 2D discriminative convolutional network pruning

Q Zheng, X Tian, M Yang, Y Wu, H Su - Multidimensional Systems and …, 2020 - Springer
Deep convolutional neural networks (CNNs) have demonstrated their extraordinary power on
various visual tasks like object detection and classification. However, it is still challenging to …

DNN surrogates for turbulence closure in CFD-based shape optimization

MG Kontou, VG Asouti, KC Giannakoglou - Applied Soft Computing, 2023 - Elsevier
A DNN-based surrogate for turbulence (and transition) closure of the Reynolds-Averaged
Navier–Stokes (RANS) equations is presented. The DNN configuration, namely the …

Channel selection using Gumbel softmax

C Herrmann, RS Bowen, R Zabih - European conference on computer …, 2020 - Springer
Important applications such as mobile computing require reducing the computational costs
of neural network inference. Ideally, applications would specify their preferred tradeoff …

Multi-objective pruning for CNNs using genetic algorithm

C Yang, Z An, C Li, B Diao, Y Xu - … 17–19, 2019, Proceedings, Part II 28, 2019 - Springer
In this work, we propose a heuristic genetic algorithm (GA) for pruning convolutional neural
networks (CNNs) according to the multi-objective trade-off among error, computation and …

Generalized dropout

S Srinivas, RV Babu - arXiv preprint arXiv:1611.06791, 2016 - arxiv.org
Deep Neural Networks often require good regularizers to generalize well. Dropout is one
such regularizer that is widely used among Deep Learning practitioners. Recent work has …