X Ma, X Wang, G Fang, Y Shen, W Lu - arXiv e-prints, 2022 - ui.adsabs.harvard.edu
Data-free knowledge distillation (DFKD) performs knowledge distillation by eliminating the
dependence on the original training data, and has recently achieved impressive results in …