Prompting to Distill: Boosting Data-Free Knowledge Distillation via Reinforced Prompt

X Ma, X Wang, G Fang, Y Shen, W Lu - arXiv preprint arXiv:2205.07523, 2022 - arxiv.org (also in the IJCAI 2022 proceedings, ijcai.org)
Data-free knowledge distillation (DFKD) performs knowledge distillation by eliminating the
dependence on the original training data, and has recently achieved impressive results in …
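
For context, below is a minimal sketch of the generic adversarial data-free KD loop that this line of work builds on. It is an illustration under assumptions, not the reinforced-prompt method of the paper itself; the tiny models, batch size, and temperature are hypothetical placeholders. A generator maps noise to synthetic inputs, the generator is pushed toward samples on which student and teacher disagree, and the student is trained to match the frozen teacher's soft predictions on those samples.

# Illustrative sketch of generic adversarial data-free KD (assumed setup;
# NOT the reinforced-prompt method of the paper above).
import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical tiny CIFAR-10-shaped models; any teacher/student pair works.
teacher = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10)).eval()
student = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10))
generator = nn.Sequential(nn.Linear(100, 3 * 32 * 32), nn.Tanh())

opt_s = torch.optim.Adam(student.parameters(), lr=1e-3)
opt_g = torch.optim.Adam(generator.parameters(), lr=1e-3)
T = 4.0  # distillation temperature

def kd_loss(s_logits, t_logits):
    # Standard soft-label distillation loss: KL between tempered softmaxes.
    return F.kl_div(F.log_softmax(s_logits / T, dim=1),
                    F.softmax(t_logits / T, dim=1),
                    reduction="batchmean") * (T * T)

for step in range(100):
    # Generator step: maximize student-teacher disagreement on fake samples.
    fake = generator(torch.randn(64, 100)).view(64, 3, 32, 32)
    with torch.no_grad():
        t_logits = teacher(fake)
    g_loss = -kd_loss(student(fake), t_logits)
    opt_g.zero_grad()
    g_loss.backward()
    opt_g.step()

    # Student step: minimize the same loss on freshly generated samples.
    fake = generator(torch.randn(64, 100)).view(64, 3, 32, 32).detach()
    with torch.no_grad():
        t_logits = teacher(fake)
    s_loss = kd_loss(student(fake), t_logits)
    opt_s.zero_grad()  # also clears any grads leaked from the generator step
    s_loss.backward()
    opt_s.step()

The teacher stays frozen throughout, so only the synthetic samples carry its knowledge to the student, which is what makes the procedure data-free.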
