H Wang, C Qin, Y Bai, Y Zhang, Y Fu - arXiv preprint arXiv:2103.06460, 2021 - arxiv.org
Neural network pruning typically removes connections or neurons from a pretrained, converged model, while a new pruning paradigm, pruning at initialization (PaI), attempts to …
Recently, Vision Transformer (ViT) has continuously established new milestones in the computer vision field, while its high computation and memory cost makes its …
Recently, a new trend of exploring sparsity to accelerate neural network training has emerged, embracing the paradigm of training on the edge. This paper proposes a novel …
Neural network quantization is a promising compression technique to reduce memory footprint and energy consumption, potentially enabling real-time inference. However …
R Burkholz - Advances in Neural Information Processing …, 2022 - proceedings.neurips.cc
The strong lottery ticket hypothesis has highlighted the potential for training deep neural networks by pruning, which has inspired interesting practical and theoretical insights into …
R Burkholz - International Conference on Machine Learning, 2022 - proceedings.mlr.press
The Lottery Ticket Hypothesis continues to have a profound practical impact on the quest for small-scale deep neural networks that solve modern deep learning tasks at …
Deep neural networks (DNNs) are effective in solving many real-world problems. Larger DNN models usually exhibit better quality (e.g., accuracy), but their excessive computation …
The lottery ticket hypothesis has sparked the rapid development of pruning algorithms that aim to reduce the computational costs associated with deep learning during training and …