Hierarchical multi-attention transfer for knowledge distillation

J Gou, L Sun, B Yu, S Wan, D Tao - ACM Transactions on Multimedia …, 2023 - dl.acm.org
… hierarchical multi-attention knowledge transfer by utilizing each attention knowledge with
the adaptively learned weight to be distilled at different layers during the distillation training …

Knowledge transfer for deep reinforcement learning with hierarchical experience replay

H Yin, S Pan - Proceedings of the AAAI Conference on Artificial …, 2017 - ojs.aaai.org
… The approach that utilizes distillation technique to conduct knowledge transfer for multi-task …
tolerance towards negative transfer. Second, we propose hierarchical prioritized experience …

Hierarchical self-supervised augmented knowledge distillation

C Yang, Z An, L Cai, Y Xu - arXiv preprint arXiv:2107.13715, 2021 - arxiv.org
… distillation framework by leveraging the architectural auxiliary classifiers, facilitating
comprehensive knowledge transfer and alleviating the mismatch problem of abstraction levels when …

Hierarchical visual-textual knowledge distillation for life-long correlation learning

Y Peng, J Qi, Z Ye, Y Zhuo - International Journal of Computer Vision, 2021 - Springer
… The visual-textual hierarchical … the knowledge at a high level of the hierarchical network.
We further propose semantic-level knowledge distillation, attention-level knowledge transfer …

Hierarchical distillation learning for scalable person search

W Li, S Gong, X Zhu - Pattern Recognition, 2021 - Elsevier
… a Hierarchical Distillation Learning approach for more discriminating knowledge transfer
largely facilitates the knowledge distillation by avoiding knowledge transfer between structure …

Adaptive hierarchy-branch fusion for online knowledge distillation

L Gong, S Lin, B Zhang, Y Shen, K Li, R Qiao… - Proceedings of the …, 2023 - ojs.aaai.org
… an adaptive hierarchy-branch fusion module to create hierarchical teacher assistants …
hierarchical teacher assistants, we construct two kinds of distillation losses for knowledge transfer

Knowledge distillation using hierarchical self-supervision augmented distribution

C Yang, Z An, L Cai, Y Xu - IEEE Transactions on Neural …, 2022 - ieeexplore.ieee.org
… knowledge gap. It facilitates comprehensive knowledge transfer from hierarchical feature
maps. … success of the self-supervision augmented task, we consider distilling self-supervision …

A hierarchical knowledge transfer framework for heterogeneous federated learning

Y Deng, J Ren, C Tang, F Lyu, Y Liu… - IEEE INFOCOM 2023 …, 2023 - ieeexplore.ieee.org
… knowledge transfer mechanism and a weighted ensemble distillation scheme with server-assisted knowledge selection …

Variational information distillation for knowledge transfer

S Ahn, SX Hu, A Damianou… - Proceedings of the …, 2019 - openaccess.thecvf.com
… framework for knowledge transfer which formulates knowledge transfer as maximizing the …
with existing knowledge transfer methods on both knowledge distillation and transfer learning …

Transferable and differentiable discrete network embedding for multi-domains with hierarchical knowledge distillation

T He, L Gao, J Song, YF Li - Information Sciences, 2023 - Elsevier
… We focus on solving two problems: knowledge transfer from a labeled domain to a new … we
propose a hierarchical knowledge distillation strategy to mitigate the knowledge gap between …