Adaptive Optimization Method of Pre-trained Language Model for Question Generation
Cite this article: SU Yulan, HONG Yu, ZHU Hongyu, WU Kaili, ZHANG Min. Adaptive Optimization Method of Pre-trained Language Model for Question Generation[J]. Journal of Chinese Information Processing, 2022, 36(3): 91-100
Authors: SU Yulan  HONG Yu  ZHU Hongyu  WU Kaili  ZHANG Min
Affiliation: School of Computer Science and Technology, Soochow University, Suzhou, Jiangsu 215006, China
Funding: National Natural Science Foundation of China (62076174); Postgraduate Research and Practice Innovation Program of Jiangsu Province (SJCX20_1064)
Abstract:
The core task of question generation is to automatically generate the interrogative sentence corresponding to a target answer, given the surrounding context. Question generation is one of the more challenging tasks in natural language processing, as it places high demands on reliable semantic encoding and decoding. Pre-trained language models are now widely applied across natural language processing tasks and have achieved strong results. Following this trend, this paper applies the pre-trained language model UNILM to the existing ...

Keywords: question generation; exposure bias; question-answering dataset; transfer learning

Adaptive Optimization Method of Pre-trained Language Model for Question Generation
SU Yulan,HONG Yu,ZHU Hongyu,WU Kaili,ZHANG Min. Adaptive Optimization Method of Pre-trained Language Model for Question Generation[J]. Journal of Chinese Information Processing, 2022, 36(3): 91-100
Authors:SU Yulan  HONG Yu  ZHU Hongyu  WU Kaili  ZHANG Min
Affiliation:School of Computer Science and Technology, Soochow University, Suzhou, Jiangsu 215006, China
Abstract:
Automatic question generation (QG) is the task of generating the interrogative sentence corresponding to a target answer, given the context. In this paper, we take advantage of pre-trained language models and apply UNILM in an encoder-decoder framework for question generation. In particular, to address the problems of "exposure bias" and "mask heterogeneity" in the model's decoding phase, we examine noise-aware training and transfer learning on UNILM to improve its adaptability. Experiments on SQuAD show that our best model yields state-of-the-art performance on the answer-aware QG task, with BLEU scores of up to 20.31% and 21.95% on split1 and split2, respectively, and on the answer-agnostic QG task, with a BLEU score of 17.90% on split1.
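The abstract names noise-aware training as a remedy for exposure bias but does not spell out the procedure. As an illustrative sketch only (the function name, signature, and corruption scheme below are assumptions, in the spirit of scheduled-sampling-style noise injection, not the authors' exact method), the ground-truth decoder inputs can be randomly corrupted during teacher forcing so the model learns to continue from imperfect prefixes, as it must at inference time:

```python
import random

def noisy_decoder_inputs(target_ids, vocab_size, noise_prob, rng=None):
    """Return a copy of the gold target sequence in which each token is
    replaced by a random vocabulary token with probability `noise_prob`.
    Training the decoder on such perturbed prefixes exposes it to
    error-laden inputs resembling its own predictions, mitigating the
    train/inference mismatch known as exposure bias."""
    rng = rng or random.Random(0)
    return [
        rng.randrange(vocab_size) if rng.random() < noise_prob else tok
        for tok in target_ids
    ]
```

With `noise_prob=0.0` this degenerates to standard teacher forcing; raising the probability trades clean supervision for robustness to the model's own mistakes.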
Keywords: question generation; exposure bias; question-answering dataset; transfer learning