Fund projects: National Natural Science Foundation of China (61972102); Key-Area Research and Development Program of Guangdong Province (2020B010166006); Department of Education of Guangdong Province Projects (Yue Jiao Gao Han [2018] No. 179; Yue Jiao Gao Han [2018] No. 1); Guangzhou Science and Technology Plan Projects (201903010107, 201802030011, 201802010026, 201802010042, 201604046017)
Received: 2020-12-17

An Attention Text Summarization Model Based on Syntactic Structure Fusion
Teng Shao-hua, Dong Pu, Zhang Wei. An Attention Text Summarization Model Based on Syntactic Structure Fusion[J]. Journal of Guangdong University of Technology, 2021, 38(3): 1-8.
Authors: Teng Shao-hua, Dong Pu, Zhang Wei
Affiliation: School of Computers, Guangdong University of Technology, Guangzhou 510006, China
Abstract: Traditional sequence-based text summarization models do not consider the contextual semantic information of words, so the generated summaries are inaccurate and do not conform to human language habits. To address this inadequacy, a Structure-Based Attention sequence-to-sequence model (SBA) is proposed. It combines an attention-based sequence-to-sequence generation model with the syntactic structure information of the text, so that the context vector produced by the attention mechanism carries both the semantic information and the syntactic structure information of the text, from which the summary is generated. Experimental results on the Gigaword dataset show that the proposed method effectively improves the accuracy and readability of the generated summaries.
Keywords: text summarization; sequence-to-sequence model; attention mechanism; syntactic structure
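The core idea in the abstract — an attention mechanism whose context vector carries both semantic and syntactic information — can be illustrated with a minimal sketch. This is not the paper's actual SBA model; the function names, the concatenation-based fusion, and the dot-product scoring below are all illustrative assumptions:

```python
import math

def softmax(scores):
    """Numerically stable softmax over a list of floats."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def fused_attention_context(semantic_states, syntactic_states, decoder_state):
    """Hypothetical sketch: attend over encoder states that concatenate a
    semantic vector and a syntactic-feature vector per token, so the
    resulting context vector carries both kinds of information."""
    # Fuse each token's semantic and syntactic vectors by concatenation.
    fused = [s + y for s, y in zip(semantic_states, syntactic_states)]
    # Dot-product attention scores against the current decoder state.
    scores = [sum(f_i * d_i for f_i, d_i in zip(f, decoder_state)) for f in fused]
    weights = softmax(scores)
    # Weighted sum of fused states yields the context vector.
    dim = len(fused[0])
    return [sum(w * f[i] for w, f in zip(weights, fused)) for i in range(dim)]

# Toy example: 3 tokens, 2-dim semantic + 2-dim syntactic vectors,
# attended by a 4-dim decoder state.
sem = [[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]]
syn = [[0.2, 0.1], [0.1, 0.2], [0.3, 0.3]]
dec = [1.0, 0.0, 0.0, 0.0]
ctx = fused_attention_context(sem, syn, dec)
print(len(ctx))  # 4-dimensional context vector
```

In a real model the fusion and scoring would be learned (e.g. trained projection matrices and an attention energy function), and the syntactic vectors would come from a parser or POS tagger; the sketch only shows how concatenated representations let one attention pass weight both signals at once.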