G Yang, S Yu, Y Sheng, H Yang - Scientific Reports, 2023 - nature.com
Existing knowledge distillation (KD) methods are mainly based on features, logits, or
attention, where features and logits represent the results of reasoning at different stages of a …