Class attention transfer based knowledge distillation
Class Attention Transfer Based Knowledge Distillation
Z Guo, H Yan, H Li, X Lin - 2023 IEEE/CVF Conference on Computer …, 2023 - computer.org / ieeexplore.ieee.org; also arXiv preprint arXiv:2304.12777 - arxiv.org
Previous knowledge distillation methods have shown their impressive performance on model compression tasks; however, it is hard to explain how the knowledge they transferred …