Multi-target knowledge distillation via student self-reflection

J Gou, X Xiong, B Yu, L Du, Y Zhan, D Tao - International Journal of …, 2023 - Springer
Abstract: Knowledge distillation is a simple yet effective technique for deep model
compression, which aims to transfer the knowledge learned by a large teacher model to a …
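
The sketch below illustrates the generic teacher-student setup the abstract refers to, using the standard soft-target distillation loss (Hinton et al.); it is not the multi-target self-reflection method proposed in the cited paper. The model sizes, temperature T, and mixing weight alpha are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Generic response-based knowledge distillation sketch (assumed setup),
# not the multi-target self-reflection scheme from the cited paper.

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Weighted sum of soft-target KL divergence and hard-label cross-entropy."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale so soft-target gradients match the hard-label term
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Illustrative large teacher and small student (hypothetical sizes).
teacher = nn.Sequential(nn.Linear(784, 1200), nn.ReLU(), nn.Linear(1200, 10))
student = nn.Sequential(nn.Linear(784, 64), nn.ReLU(), nn.Linear(64, 10))

x = torch.randn(32, 784)               # dummy input batch
y = torch.randint(0, 10, (32,))        # dummy labels

with torch.no_grad():                   # teacher is frozen during distillation
    t_logits = teacher(x)
s_logits = student(x)

loss = distillation_loss(s_logits, t_logits, y)
loss.backward()                         # gradients flow only into the student
```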
