Knowledge distillation and student-teacher learning for visual intelligence: A review and new outlooks

L Wang, KJ Yoon - IEEE transactions on pattern analysis and …, 2021 - ieeexplore.ieee.org
In recent years, deep neural models have been successful in almost every field, even on
highly complex problems. However, these models are huge in size with …

A survey of research on knowledge distillation (知识蒸馏研究综述)

黄震华, 杨顺志, 林威, 倪娟, 孙圣力, 陈运文, 汤庸 - 计算机学报, 2022 - 159.226.43.17
Abstract High-performance deep learning networks are typically compute- and parameter-intensive,
making them hard to deploy on resource-constrained edge devices. Running deep learning models on low-resource devices requires efficient small-scale networks …

Knowledge distillation: A survey

J Gou, B Yu, SJ Maybank, D Tao - International Journal of Computer Vision, 2021 - Springer
In recent years, deep neural networks have been successful in both industry and academia,
especially for computer vision tasks. The great success of deep learning is mainly due to its …

Group knowledge transfer: Federated learning of large cnns at the edge

C He, M Annavaram… - Advances in Neural …, 2020 - proceedings.neurips.cc
Scaling up the convolutional neural network (CNN) size (e.g., width, depth) is known to
effectively improve model accuracy. However, the large model size impedes training on …

Resrep: Lossless cnn pruning via decoupling remembering and forgetting

X Ding, T Hao, J Tan, J Liu, J Han… - Proceedings of the …, 2021 - openaccess.thecvf.com
We propose ResRep, a novel method for lossless channel pruning (aka filter pruning), which
slims down a CNN by reducing the width (number of output channels) of convolutional …

Adaptive multi-teacher multi-level knowledge distillation

Y Liu, W Zhang, J Wang - Neurocomputing, 2020 - Elsevier
Abstract Knowledge distillation (KD) is an effective learning paradigm for improving the
performance of lightweight student networks by utilizing additional supervision knowledge …

Omnisupervised omnidirectional semantic segmentation

K Yang, X Hu, Y Fang, K Wang… - IEEE Transactions on …, 2020 - ieeexplore.ieee.org
Modern efficient Convolutional Neural Networks (CNNs) are able to perform semantic
segmentation both swiftly and accurately, which covers typically separate detection tasks …

Industrial cyber-physical systems-based cloud IoT edge for federated heterogeneous distillation

C Wang, G Yang, G Papanastasiou… - IEEE Transactions …, 2020 - ieeexplore.ieee.org
Deep convolutional networks have been widely deployed in modern cyber-physical
systems performing different visual classification tasks. As the fog and edge devices have …

Text is text, no matter what: Unifying text recognition using knowledge distillation

AK Bhunia, A Sain, PN Chowdhury… - Proceedings of the …, 2021 - openaccess.thecvf.com
Text recognition remains a fundamental and extensively researched topic in computer
vision, largely owing to its wide array of commercial applications. The challenging nature of …

Collaborative multi-teacher knowledge distillation for learning low bit-width deep neural networks

C Pham, T Hoang, TT Do - Proceedings of the IEEE/CVF …, 2023 - openaccess.thecvf.com
Abstract Knowledge distillation which learns a lightweight student model by distilling
knowledge from a cumbersome teacher model is an attractive approach for learning …
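The student–teacher objective that recurs across these entries is the classic soft-target distillation loss of Hinton et al. (2015): the student matches the teacher's temperature-softened output distribution via a KL divergence scaled by T². A minimal dependency-free sketch (the function names and example logits are illustrative, not from any of the papers above):

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T yields a softer distribution.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, T=2.0):
    # KL divergence between softened teacher and student distributions,
    # scaled by T^2 so gradients keep a comparable magnitude across T.
    p = softmax(teacher_logits, T)  # teacher "soft targets"
    q = softmax(student_logits, T)
    return (T ** 2) * sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# A student that exactly matches the teacher incurs zero distillation loss.
print(round(kd_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]), 6))  # 0.0
```

In practice this term is combined with the ordinary cross-entropy on ground-truth labels, weighted by a mixing coefficient; the multi-teacher works above (e.g., Liu et al.; Pham et al.) additionally aggregate soft targets from several teachers before computing the divergence.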