The growing energy and performance costs of deep learning have driven the community to reduce the size of neural networks by selectively pruning components. Similarly to their …
U Evci, T Gale, J Menick, PS Castro… - … on machine learning, 2020 - proceedings.mlr.press
Many applications require sparse neural networks due to space or inference time restrictions. There is a large body of work on training dense networks to yield sparse …
Channel pruning is one of the predominant approaches for deep model compression. Existing pruning methods either train from scratch with sparsity constraints on channels, or …
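Channel pruning as described here removes entire output channels rather than individual weights. A common scoring criterion (an assumption for illustration, not the specific method of this snippet) is the L1 norm of each filter; a minimal numpy sketch:

```python
import numpy as np

def channel_l1_scores(conv_weight):
    """Score each output channel of a conv weight tensor shaped
    (out_channels, in_channels, kH, kW) by the L1 norm of its filter."""
    return np.sum(np.abs(conv_weight), axis=(1, 2, 3))

def prune_channels(conv_weight, keep_ratio):
    """Keep the highest-scoring output channels.

    Returns the slimmed weight tensor and the kept channel indices,
    so the following layer's input channels can be sliced to match.
    """
    scores = channel_l1_scores(conv_weight)
    n_keep = max(1, int(round(keep_ratio * scores.size)))
    keep = np.sort(np.argsort(scores)[::-1][:n_keep])
    return conv_weight[keep], keep
```

Because whole channels are removed, the pruned layer stays dense and needs no sparse kernels at inference time, which is why channel pruning is popular for practical model compression.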
The deployment of deep convolutional neural networks (CNNs) in many real world applications is largely hindered by their high computational cost. In this paper, we propose a …
We propose a practical method for $L_0$ norm regularization for neural networks: pruning the network during training by encouraging weights to become exactly zero. Such …
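One common way such $L_0$-style pruning is implemented is with stochastic stretched "hard-concrete" gates, which take the value exactly 0 or 1 with nonzero probability while remaining differentiable in their parameters. A minimal numpy sketch (the temperature $\beta$ and stretch limits $\gamma$, $\zeta$ are assumed defaults, not values taken from this snippet):

```python
import numpy as np

def hard_concrete_gate(log_alpha, beta=2/3, gamma=-0.1, zeta=1.1, rng=None):
    """Sample hard-concrete gates in [0, 1]; gates hit exactly 0 or 1
    with nonzero probability, so gated weights can be truly pruned."""
    rng = np.random.default_rng() if rng is None else rng
    u = rng.uniform(1e-6, 1 - 1e-6, size=np.shape(log_alpha))
    # Binary-concrete sample, then stretch to (gamma, zeta) and clip.
    s = 1 / (1 + np.exp(-(np.log(u) - np.log(1 - u) + log_alpha) / beta))
    return np.clip(s * (zeta - gamma) + gamma, 0.0, 1.0)

def expected_l0(log_alpha, beta=2/3, gamma=-0.1, zeta=1.1):
    """Expected number of nonzero gates: the differentiable surrogate
    for the L0 norm added to the training loss as a penalty."""
    return np.sum(1 / (1 + np.exp(-(log_alpha - beta * np.log(-gamma / zeta)))))
```

In training, each weight (or group of weights) is multiplied by its sampled gate, and `expected_l0` is added to the loss scaled by a sparsity coefficient; gates whose `log_alpha` drifts low end up sampling exactly zero and can be removed.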
Given the increasing promise of graph neural networks (GNNs) in real-world applications, several methods have been developed for explaining their predictions. Existing methods for …
The growth of the Machine-Learning-As-A-Service (MLaaS) market has highlighted clients' data privacy and security issues. Private inference (PI) techniques using cryptographic …
Work on the lottery ticket hypothesis (LTH) and single-shot network pruning (SNIP) has recently drawn considerable attention to post-training pruning (iterative magnitude pruning) and before …
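The iterative magnitude pruning mentioned here alternates between zeroing the smallest-magnitude weights and retraining the survivors. A minimal numpy sketch (the linear sparsity schedule and the `retrain` callback are illustrative assumptions):

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude entries so that roughly
    `sparsity` fraction of the weights become zero (one pruning step)."""
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return weights.copy()
    threshold = np.partition(flat, k - 1)[k - 1]
    return weights * (np.abs(weights) > threshold)

def iterative_magnitude_prune(weights, final_sparsity, steps, retrain):
    """Alternate pruning and retraining; `retrain` stands in for
    fine-tuning the surviving weights with the mask held fixed."""
    w = weights
    for step in range(1, steps + 1):
        sparsity = final_sparsity * step / steps  # linear schedule (assumption)
        w = magnitude_prune(w, sparsity)
        w = retrain(w)
    return w
```

LTH-style experiments additionally rewind surviving weights to their initial values before retraining, whereas SNIP scores connections once at initialization and prunes before any training.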
S Liu, L Yin, DC Mocanu… - … on Machine Learning, 2021 - proceedings.mlr.press
In this paper, we introduce a new perspective on training deep neural networks capable of state-of-the-art performance without the need for the expensive over-parameterization by …