Machine Reading Comprehension Based on Sequence and Graph Structure
CHEN Zheng, REN Jiankun, YUAN Haorui. Machine Reading Comprehension Based on Sequence and Graph Structure[J]. Journal of Chinese Information Processing, 2021, 35(4): 120-128.
Authors: CHEN Zheng, REN Jiankun, YUAN Haorui
Affiliation: School of Information and Software Engineering, University of Electronic Science and Technology of China, Chengdu, Sichuan 610054, China
Fund project: Sichuan Provincial Science and Technology Program (2020YFG0009)
Abstract: Machine reading comprehension (MRC) is an essential and challenging task in natural language processing (NLP). In recent years, large-scale pre-trained language models, typified by BERT, have achieved remarkable success on this task. However, constrained by the structure and scale of the sequence model, BERT-based reading comprehension models show notable deficiencies in long-distance and global semantic construction, which limits their performance on reading comprehension tasks. To address this problem, this paper proposes a new MRC model that fuses sequence and graph structure. First, named entities are extracted from the text, and a named-entity co-occurrence graph is built with two schemes: sentence co-occurrence and sliding-window co-occurrence. A spatial graph convolutional network then learns embedding representations of the named entities, and the entity embeddings obtained from the graph structure are fused into the text embeddings obtained from the sequence structure. Finally, the answer is produced by span extraction. Experimental results show that, compared with a sequence-based reading comprehension model built on BERT, the proposed model improves EM by 7.8% and F1 by 6.6%.
Keywords: machine reading comprehension  graph neural network  deep learning
Received: 2020-08-20
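
The pipeline described in the abstract is concrete enough to sketch. Below is a minimal Python illustration of the first step: building the named-entity co-occurrence graph with the two schemes the paper names, sentence co-occurrence and sliding-window co-occurrence. The function names, data layout, and the window size are assumptions made for illustration, not the authors' code.

```python
# Minimal sketch of named-entity co-occurrence graph construction,
# following the two schemes named in the abstract. All names here are
# illustrative, not taken from the paper's implementation.
from collections import defaultdict
from itertools import combinations

def sentence_cooccurrence(sent_entities):
    """Count entity pairs that appear in the same sentence.

    sent_entities: one list of entity mentions per sentence,
    e.g. [["BERT", "Google"], ["BERT", "SQuAD"]].
    """
    edges = defaultdict(int)
    for entities in sent_entities:
        for a, b in combinations(sorted(set(entities)), 2):
            edges[(a, b)] += 1
    return edges

def window_cooccurrence(tokens, entity_set, window=5):
    """Count entity pairs that co-occur within a sliding token window."""
    edges = defaultdict(int)
    for i, tok in enumerate(tokens):
        if tok not in entity_set:
            continue
        for other in tokens[i + 1 : i + window]:
            if other in entity_set and other != tok:
                edges[tuple(sorted((tok, other)))] += 1
    return edges
```

The edge counts from the two schemes can then be merged (for example, summed) into a single weighted adjacency matrix over the entity vocabulary, which is the input the graph side of the model needs.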
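
Downstream of the graph, the abstract describes a spatial graph convolution over entity nodes, fusion of the resulting entity embeddings into the BERT token embeddings, and span extraction. A compact PyTorch sketch of that flow follows; the single-layer design, the concatenation-and-projection fusion operator, and the token-to-entity alignment matrix are all assumptions for illustration and may differ from the paper's actual architecture.

```python
# Compact sketch of the rest of the pipeline: spatial graph convolution,
# fusion with sequence embeddings, and span extraction. Shapes and module
# names are assumptions, not the paper's implementation.
import torch
import torch.nn as nn

class SpatialGCNLayer(nn.Module):
    """One spatial graph convolution: average neighbor features through
    the weighted adjacency matrix, then apply a linear transform."""
    def __init__(self, dim):
        super().__init__()
        self.linear = nn.Linear(dim, dim)

    def forward(self, x, adj):  # x: (n_entities, dim), adj: (n_entities, n_entities)
        deg = adj.sum(dim=-1, keepdim=True).clamp(min=1)
        return torch.relu(self.linear(adj @ x / deg))

class FusionSpanExtractor(nn.Module):
    """Fuse graph-side entity embeddings into sequence-side token
    embeddings, then predict answer start/end positions."""
    def __init__(self, dim):
        super().__init__()
        self.gcn = SpatialGCNLayer(dim)
        self.fuse = nn.Linear(2 * dim, dim)
        self.span_head = nn.Linear(dim, 2)  # start and end logits

    def forward(self, token_emb, entity_emb, adj, align):
        # token_emb: (seq_len, dim) from a BERT-style encoder.
        # align: (seq_len, n_entities) 0/1 matrix mapping each token to
        # the entity it mentions (all-zero rows for non-entity tokens).
        ent = self.gcn(entity_emb, adj)
        ent_per_token = align @ ent                       # (seq_len, dim)
        fused = self.fuse(torch.cat([token_emb, ent_per_token], dim=-1))
        start_logits, end_logits = self.span_head(fused).unbind(dim=-1)
        return start_logits, end_logits
```

In this sketch the graph side is aligned to the sequence side through the 0/1 token-to-entity matrix; the abstract does not specify the fusion mechanism at this level of detail, so concatenation followed by a linear projection is used as a generic placeholder.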