Cdgp: Automatic cloze distractor generation based on pre-trained language model

SH Chiang, SC Wang, YC Fan - arXiv preprint arXiv:2403.10326, 2024 - arxiv.org
Manually designing cloze tests consumes enormous time and effort. The major challenge lies in selecting the wrong options (distractors). Carefully designed distractors improve the effectiveness of learner ability assessment. This motivates the idea of automatically generating cloze distractors. In this paper, we investigate cloze distractor generation by exploring the use of pre-trained language models (PLMs) as an alternative for candidate distractor generation. Experiments show that the PLM-enhanced model brings a substantial performance improvement. Our best performing model advances the state-of-the-art result from 14.94 to 34.17 (NDCG@10 score). Our code and dataset are available at https://github.com/AndyChiangSH/CDGP.
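The abstract reports results as NDCG@10, which rewards ranking gold distractors near the top of the generated candidate list. As a point of reference (not the paper's own evaluation code), a minimal sketch of NDCG@k over binary relevance labels:

```python
import math

def dcg_at_k(relevances, k):
    # Discounted cumulative gain: relevance discounted by log2 of rank.
    return sum(rel / math.log2(i + 2) for i, rel in enumerate(relevances[:k]))

def ndcg_at_k(relevances, k=10):
    # Normalize by the DCG of the ideal (descending-relevance) ordering.
    ideal = dcg_at_k(sorted(relevances, reverse=True), k)
    return dcg_at_k(relevances, k) / ideal if ideal > 0 else 0.0

# Hypothetical ranked candidate list: 1 where a candidate matches a gold distractor.
labels = [1, 0, 1, 0, 0, 0, 0, 0, 0, 0]
print(round(ndcg_at_k(labels), 4))  # -> 0.9197
```

A perfect ranking (all relevant items first) scores 1.0; placing gold distractors lower in the candidate list reduces the score, which is why the reported jump from 14.94 to 34.17 reflects better-ranked candidates, not just more of them.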