Xinge Ma
Verified email at mail.ynu.edu.cn
Title
Cited by
Year
Knowledge distillation with reptile meta-learning for pretrained language model compression
X Ma, J Wang, LC Yu, X Zhang
Proceedings of the 29th International Conference on Computational …, 2022
Cited by 8 · 2022
YNU-HPCC at SemEval-2021 task 11: using a BERT model to extract contributions from NLP scholarly articles
X Ma, J Wang, X Zhang
Proceedings of the 15th International Workshop on Semantic Evaluation …, 2021
Cited by 8 · 2021
FedID: Federated Interactive Distillation for Large-Scale Pretraining Language Models
X Ma, J Liu, J Wang, X Zhang
Proceedings of the 2023 Conference on Empirical Methods in Natural Language …, 2023
Cited by 1 · 2023
Articles 1–3