In recent years, deep neural networks have been successful in both industry and academia, especially for computer vision tasks. The great success of deep learning is mainly due to its …
C He, M Annavaram… - Advances in Neural …, 2020 - proceedings.neurips.cc
Scaling up the convolutional neural network (CNN) size (e.g., width, depth, etc.) is known to effectively improve model accuracy. However, the large model size impedes training on …
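The scaling idea in the snippet above can be made concrete with a toy example. The sketch below (PyTorch, illustrative only and not the paper's code) scales a small CNN's channel counts by a hypothetical width multiplier and prints how fast the parameter count grows:

```python
# Minimal sketch of width scaling: every channel count is multiplied
# by width_mult. This is a generic illustration, not the paper's model.
import torch.nn as nn

def make_cnn(width_mult: float = 1.0, num_classes: int = 10) -> nn.Sequential:
    """Toy CNN whose per-layer channel counts scale with width_mult."""
    c1, c2 = int(32 * width_mult), int(64 * width_mult)
    return nn.Sequential(
        nn.Conv2d(3, c1, 3, padding=1), nn.ReLU(),
        nn.Conv2d(c1, c2, 3, padding=1), nn.ReLU(),
        nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        nn.Linear(c2, num_classes),
    )

for w in (0.5, 1.0, 2.0):
    n_params = sum(p.numel() for p in make_cnn(w).parameters())
    print(f"width_mult={w}: {n_params:,} parameters")
```

Doubling the width roughly quadruples the convolutional parameter count, which is one reason large models quickly become hard to train on memory-limited devices.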
We propose ResRep, a novel method for lossless channel pruning (aka filter pruning), which slims down a CNN by reducing the width (number of output channels) of convolutional …
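For illustration, the sketch below shows the simplest form of channel (filter) pruning: ranking a convolution's output channels by the L1 norm of their filters and keeping only the strongest. This is the classic magnitude criterion, not ResRep's re-parameterization scheme, and `prune_conv_channels` is a hypothetical helper:

```python
# Generic channel-pruning sketch (magnitude criterion), shown only to
# make the concept concrete; ResRep itself works differently.
import torch
import torch.nn as nn

def prune_conv_channels(conv: nn.Conv2d, keep_ratio: float = 0.5) -> nn.Conv2d:
    """Return a narrower Conv2d keeping the filters with largest L1 norm."""
    n_keep = max(1, int(conv.out_channels * keep_ratio))
    # L1 norm of each output filter: shape (out_channels,)
    norms = conv.weight.detach().abs().sum(dim=(1, 2, 3))
    keep = torch.topk(norms, n_keep).indices.sort().values
    pruned = nn.Conv2d(conv.in_channels, n_keep, conv.kernel_size,
                       stride=conv.stride, padding=conv.padding,
                       bias=conv.bias is not None)
    with torch.no_grad():
        pruned.weight.copy_(conv.weight[keep])
        if conv.bias is not None:
            pruned.bias.copy_(conv.bias[keep])
    return pruned

conv = nn.Conv2d(16, 64, 3, padding=1)
print(prune_conv_channels(conv, 0.25))  # Conv2d(16, 16, ...)
```

In a real network the next layer's input channels must be sliced with the same indices, which is what makes channel pruning structurally delicate and motivates methods like ResRep.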
Y Liu, W Zhang, J Wang - Neurocomputing, 2020 - Elsevier
Knowledge distillation (KD) is an effective learning paradigm for improving the performance of lightweight student networks by utilizing additional supervision knowledge …
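As a concrete reference point, the sketch below implements the standard distillation loss in the Hinton et al. formulation (a temperature-softened KL term plus hard-label cross-entropy); the paper above may use different or additional supervision signals:

```python
# Standard KD loss: student matches the teacher's softened output
# distribution in addition to the usual cross-entropy on hard labels.
import torch
import torch.nn.functional as F

def kd_loss(student_logits: torch.Tensor,
            teacher_logits: torch.Tensor,
            labels: torch.Tensor,
            T: float = 4.0,
            alpha: float = 0.9) -> torch.Tensor:
    """Weighted sum of soft (KL to teacher) and hard (CE to labels) terms."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale so soft-term gradients match the hard term
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

s = torch.randn(8, 10, requires_grad=True)
t = torch.randn(8, 10)
y = torch.randint(0, 10, (8,))
print(kd_loss(s, t, y))
```

The T² factor is the usual correction that keeps the soft term's gradient magnitude comparable to the hard term as the temperature grows.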
K Yang, X Hu, Y Fang, K Wang… - IEEE Transactions on …, 2020 - ieeexplore.ieee.org
Modern efficient Convolutional Neural Networks (CNNs) are able to perform semantic segmentation both swiftly and accurately, covering detection tasks that are typically separate …
Deep convolutional networks have been widely deployed in modern cyber-physical systems performing different visual classification tasks. As the fog and edge devices have …
Text recognition remains a fundamental and extensively researched topic in computer vision, largely owing to its wide array of commercial applications. The challenging nature of …
C Pham, T Hoang, TT Do - Proceedings of the IEEE/CVF …, 2023 - openaccess.thecvf.com
Knowledge distillation, which learns a lightweight student model by distilling knowledge from a cumbersome teacher model, is an attractive approach for learning …