2024-06-24
[Academic Highlight – Top Conference Paper] Automatic Distractor Generation Based on Pre-trained Language Models
Core Technology: Advanced Research and Resource Integration Platform for AI Core Technology
【Department of Computer Science and Engineering / Yao-Chung Fan / Associate Professor】
Posted: 2022/12/7–12/11
Paper Title | CDGP: Automatic Cloze Distractor Generation based on Pre-trained Language Model
Venue | Findings of the Association for Computational Linguistics: EMNLP 2022 (indexed conference venue)
Year, Volume, Pages | 2022, pp. 5835–5840; presented at the International Conference on Empirical Methods in Natural Language Processing (EMNLP 2022), Abu Dhabi, United Arab Emirates, December 7–11, 2022
Authors | Shang-Hsuan Chiang; Ssu-Cheng Wang; Yao-Chung Fan (范耀中)*
DOI | 10.18653/v1/2022.findings-emnlp.429 |
English Abstract | Manually designing cloze tests consumes enormous time and effort. The major challenge lies in selecting the wrong options (distractors): carefully designed distractors improve the effectiveness of learner ability assessment. This motivates the idea of automatically generating cloze distractors. In this paper, we investigate cloze distractor generation by exploring the use of pre-trained language models (PLMs) as an alternative for candidate distractor generation. Experiments show that the PLM-enhanced model brings a substantial performance improvement. Our best-performing model advances the state-of-the-art result from 14.94 to 34.17 (NDCG@10 score). Our code and dataset are available at this https URL.
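For readers unfamiliar with the approach, the snippet below is a minimal sketch of the PLM-based candidate-generation step the abstract describes, using the Hugging Face fill-mask pipeline. The model choice (bert-base-uncased), the blank marker, and the answer-filtering rule are illustrative assumptions; this is not the authors' CDGP implementation, which is available at the link above.

```python
# Minimal sketch: generating cloze distractor candidates with a masked LM.
# Illustrates the general PLM-based candidate-generation idea from the
# abstract; it is NOT the paper's CDGP pipeline (see the authors' release).
from transformers import pipeline

# Any BERT-style masked language model works here; bert-base-uncased is
# just a convenient public checkpoint, not necessarily the paper's choice.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

def candidate_distractors(sentence_with_blank, answer, top_k=10):
    """Replace the blank with [MASK], then rank the MLM's predictions.

    sentence_with_blank: cloze stem with '___' marking the blank (assumed marker).
    answer: the correct option, which must be filtered out of the candidates.
    """
    masked = sentence_with_blank.replace("___", fill_mask.tokenizer.mask_token)
    predictions = fill_mask(masked, top_k=top_k + 1)  # +1 in case the answer appears
    # Keep high-probability tokens that differ from the correct answer.
    return [p["token_str"].strip() for p in predictions
            if p["token_str"].strip().lower() != answer.lower()][:top_k]

if __name__ == "__main__":
    stem = "The students were asked to ___ the experiment before class."
    print(candidate_distractors(stem, answer="repeat"))
```

In the full CDGP setting, such raw candidates would still need ranking and filtering against the passage context; this sketch shows only the candidate-generation stage.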
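The abstract reports results as NDCG@10, a standard ranking metric. Below is a minimal sketch of its textbook definition; the toy relevance labels are assumptions for illustration, not data from the paper.

```python
# Minimal sketch of NDCG@10, the ranking metric reported in the abstract.
# Standard textbook definition; the relevance labels here are toy values.
import math

def dcg_at_k(relevances, k=10):
    """Discounted cumulative gain over the top-k ranked items."""
    return sum(rel / math.log2(i + 2) for i, rel in enumerate(relevances[:k]))

def ndcg_at_k(relevances, k=10):
    """DCG normalized by the DCG of an ideally ordered ranking."""
    ideal = sorted(relevances, reverse=True)
    idcg = dcg_at_k(ideal, k)
    return dcg_at_k(relevances, k) / idcg if idcg > 0 else 0.0

# Example: 1 marks a ranked candidate that matches a gold distractor.
print(ndcg_at_k([1, 0, 1, 0, 0, 0, 0, 0, 0, 0]))  # ≈ 0.92 for this toy ranking
```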