R Mishra, H Gupta - ACM Computing Surveys, 2023 - dl.acm.org
Deep Neural Networks (DNNs) have gained unprecedented popularity due to their high-order performance and automated feature extraction capability. This has encouraged …
P Hu, X Peng, H Zhu, MMS Aly, J Lin - Proceedings of the AAAI …, 2021 - ojs.aaai.org
As Deep Neural Networks (DNNs) are usually overparameterized and have millions of weight parameters, it is challenging to deploy these large DNN models on resource …
Today's deep neural networks (DNNs) are becoming deeper and wider because of increasing demands on analysis quality and increasingly complex applications to …
Model compression methods, which aim to alleviate the heavy load of deep neural networks (DNNs) in real-world applications, have become popular in recent years. However, most of …
Deep networks often possess a vast number of parameters, and their substantial redundancy in parameterization is a widely recognized property. This presents significant …
H Yang, S Gui, Y Zhu, J Liu - Proceedings of the IEEE/CVF …, 2020 - openaccess.thecvf.com
Deep Neural Networks (DNNs) are applied in a wide range of use cases. There is an increased demand for deploying DNNs on devices that do not have abundant resources …
The success of deep learning in numerous application domains has created the desire to run and train such models on mobile devices. This, however, conflicts with their computationally, memory …
J O'Neill - arXiv preprint arXiv:2006.03669, 2020 - arxiv.org
Overparameterized networks trained to convergence have shown impressive performance in domains such as computer vision and natural language processing. Pushing state of the …
Layer-wise magnitude-based pruning (LMP) is a very popular method for deep neural network (DNN) compression. However, tuning the layer-specific thresholds is a difficult task …
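The snippet above refers to layer-wise magnitude-based pruning: each layer's smallest-magnitude weights are zeroed under its own threshold, and choosing those per-layer thresholds is the hard tuning problem. Below is a minimal PyTorch sketch of that idea; the function name, the per-layer `sparsity` mapping, and the toy model are illustrative assumptions, not the paper's method.

```python
import torch
import torch.nn as nn

def layerwise_magnitude_prune(model: nn.Module, sparsity: dict) -> None:
    """Zero the smallest-magnitude weights of each named layer independently.

    `sparsity` maps a layer name (as reported by model.named_modules())
    to the fraction of that layer's weights to remove; picking these
    per-layer values is the threshold-tuning problem LMP faces.
    """
    with torch.no_grad():
        for name, module in model.named_modules():
            if name in sparsity and hasattr(module, "weight"):
                w = module.weight
                k = int(sparsity[name] * w.numel())
                if k == 0:
                    continue
                # Layer-specific threshold: magnitude of the k-th smallest weight.
                threshold = w.abs().flatten().kthvalue(k).values
                # Zero every weight at or below this layer's threshold.
                w.masked_fill_(w.abs() <= threshold, 0.0)

# Usage on a toy model: prune 50% of layer "0" and 80% of layer "2".
model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))
layerwise_magnitude_prune(model, {"0": 0.5, "2": 0.8})
```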