
Semantic relation extraction using attention-based LSTM*
Cite this article: Wang Hong, Shi Jinchuan, Zhang Zhiwei. Semantic relation extraction using attention-based LSTM* [J]. Application Research of Computers, 2018, 35(5).
Authors: Wang Hong  Shi Jinchuan  Zhang Zhiwei
Affiliation: School of Computer Science and Technology, Civil Aviation University of China
Funding: National Natural Science Foundation of China (U1633110, U1533104, U1233113)
Abstract: Among existing relation extraction methods, traditional deep learning approaches suffer from the long-distance dependency problem and do not consider the correlation between the model's input and output. To address these problems, this paper proposes a relation extraction method that combines an LSTM (long short-term memory) model with an attention mechanism. First, the text is vectorized and local text features are extracted. The local features are then fed into a bidirectional LSTM model, and the attention mechanism computes importance weights for the correlation between the LSTM's input and output, from which the global text features are obtained. Finally, the local and global features are fused, and a classifier outputs the classification result. Experimental results on the SemEval-2010 Task 8 corpus show that the method further improves accuracy and stability over traditional deep learning methods, providing method-level support for fields such as automatic question answering, information retrieval, and ontology learning.

Keywords: text information  semantic relation  relation extraction  LSTM  attention mechanism
Received: 2017-06-02
Revised: 2018-03-16

Text semantic relation extraction with attention-based LSTM
Wang Hong, Shi Jinchuan and Zhang Zhiwei. Text semantic relation extraction with attention-based LSTM [J]. Application Research of Computers, 2018, 35(5).
Authors: Wang Hong, Shi Jinchuan and Zhang Zhiwei
Affiliation: School of Computer Science and Technology, Civil Aviation University of China
Abstract: In existing relation extraction methods, traditional deep learning models suffer from long-distance dependence and do not consider the correlation between model input and output. A new relation extraction model is proposed that combines LSTM with an attention mechanism. Firstly, the model embeds the text information and obtains the local features. Secondly, the local features are fed into a bidirectional LSTM model, and the attention mechanism calculates importance probabilities between the input and output of the LSTM to obtain the global features. Finally, the local and global features are fused, and a classifier produces the relation extraction result. Experiments conducted on the SemEval-2010 Task 8 corpus show that the accuracy and stability of the method improve over traditional deep learning methods, which provides method support for automatic question answering, information retrieval and ontology learning.
Keywords:text information  semantic relation  relation extraction  LSTM  attention mechanism
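The pipeline the abstract describes — attention weights computed over bidirectional LSTM hidden states to form a global feature, fused with a local feature before classification — can be sketched in NumPy as follows. This is a minimal illustration, not the paper's implementation: the dimensions, the attention query vector `w`, and the use of max-pooling for the local feature are all assumptions.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

# Hypothetical dimensions: T tokens, d = width of BiLSTM output per token.
T, d = 6, 8
rng = np.random.default_rng(0)
H = rng.standard_normal((T, d))   # stand-in for BiLSTM hidden states, one row per token
w = rng.standard_normal(d)        # learnable attention query vector (assumed form)

scores = H @ w                    # importance score for each time step
alpha = softmax(scores)           # attention weights, sum to 1
global_feat = alpha @ H           # weighted sum of hidden states -> sentence-level feature

local_feat = H.max(axis=0)        # e.g. max-pooled local feature (assumed choice)
fused = np.concatenate([global_feat, local_feat])  # feature fusion fed to the classifier
```

In a trained model, `H` would come from the bidirectional LSTM and `w` would be learned jointly with the classifier; here random values stand in to show the shapes and the flow.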