Short Text Summary Generation with Global Self-matching Mechanism
Cite this article: WU Ren-Shou, WANG Hong-Ling, WANG Zhong-Qing, ZHOU Guo-Dong. Short text summary generation with global self-matching mechanism[J]. Journal of Software, 2019, 30(9): 2705-2717.
Authors: WU Ren-Shou  WANG Hong-Ling  WANG Zhong-Qing  ZHOU Guo-Dong
Affiliation: School of Computer Science and Technology, Soochow University, Suzhou 215006, China
Foundation: National Natural Science Foundation of China (61806137); Natural Science Research Program of Jiangsu Higher Education Institutions (18KJB520043); Priority Academic Program Development of Jiangsu Higher Education Institutions
Abstract: In recent years, sequence-to-sequence learning models with an encoder-decoder architecture have become the mainstream approach to abstractive summarization. When computing the hidden representation of a word, such models usually consider only a few words before (or after) it, so they cannot capture global information and thus cannot optimize globally. To address this problem, this study introduces a global self-matching mechanism at the encoder for global optimization, together with a global gating unit that extracts the core content of the text. Based on the degree of semantic matching between each word and the text as a whole, the global self-matching mechanism dynamically collects, from the entire text, the information relevant to each word, and then encodes the word together with its matched information into the final hidden representation, yielding hidden representations that incorporate global information. Meanwhile, since injecting global information into every word may introduce redundancy, the global gating unit filters the information flowing into the decoder according to the global information obtained from the self-matching layer, distilling the core content of the source text. Experimental results show that, compared with current mainstream abstractive summarization methods, the proposed method achieves a significant improvement in ROUGE scores, indicating that the model can effectively fuse global information and mine the core content of the source text.

Keywords: self-matching mechanism  global information  neural networks  automatic text summarization  natural language generation
Received: 2019-01-07
Revised: 2019-03-02

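As a rough illustration of the idea described in the abstract (not the authors' released code), the two components can be sketched in plain NumPy: each word's hidden state is matched against every position of the text to collect a globally relevant context, and a sigmoid gate then decides how much of that context flows onward. All names, dimensions, and the dot-product matching score are hypothetical choices for this sketch.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def global_self_matching(H, Wg, bg):
    """H: (T, d) encoder hidden states for one text.
    Step 1 (self-matching): score each position against all T positions
    and collect a globally matched context for every word.
    Step 2 (global gating): fuse context and original state through a
    sigmoid gate, filtering what flows toward the decoder."""
    scores = H @ H.T / np.sqrt(H.shape[1])   # (T, T) word-vs-text matching
    A = softmax(scores, axis=1)              # attention over the whole text
    C = A @ H                                # (T, d) global context per word
    gate_in = np.concatenate([H, C], axis=1) # (T, 2d)
    g = 1.0 / (1.0 + np.exp(-(gate_in @ Wg + bg)))  # (T, d) sigmoid gate
    return g * C + (1.0 - g) * H             # gated, global-aware states

# Toy usage with random states and gate parameters.
rng = np.random.default_rng(0)
T, d = 5, 8
H = rng.standard_normal((T, d))
Wg = rng.standard_normal((2 * d, d)) * 0.1
bg = np.zeros(d)
out = global_self_matching(H, Wg, bg)
print(out.shape)  # (5, 8): one global-aware representation per word
```

In the paper's actual model the matching scores and gate would be learned jointly with the encoder-decoder; this sketch only shows the data flow of matching followed by gating.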