
Semantic relation extraction model via attention based neural Turing machine
Cite this article: ZHANG Runyan, MENG Fanrong, ZHOU Yong, LIU Bing. Semantic relation extraction model via attention based neural Turing machine[J]. Journal of Computer Applications, 2018, 38(7): 1831-1838. DOI: 10.11772/j.issn.1001-9081.2017123009
Authors: ZHANG Runyan  MENG Fanrong  ZHOU Yong  LIU Bing
Affiliation: 1. School of Computer Science and Technology, China University of Mining and Technology, Xuzhou, Jiangsu 221116, China; 2. Institute of Electronics, Chinese Academy of Sciences, Beijing 100080, China
Foundation item: General Program of the National Natural Science Foundation of China (61572505).
Abstract: To address the poor performance on long sentences and the weak influence of core words in semantic relation extraction (semantic relation classification), an Attention-based bidirectional Neural Turing Machine (Ab-NTM) model was proposed. First, a Neural Turing Machine (NTM) was adopted as an improvement over the Recurrent Neural Network (RNN), with a Long Short-Term Memory (LSTM) network as its controller; its non-interfering storage strengthens the model's memory over long sentences. Then, an attention layer was built to organize word-level context information, so that the model could emphasize the core words in a sentence. Finally, the result was fed into a classifier to obtain the semantic relation labels. Experiments on the SemEval-2010 Task 8 public dataset show that the proposed model achieves a score of 86.2%, outperforming other methods.

Keywords: natural language processing  semantic relation extraction  recurrent neural network  bidirectional neural Turing machine  attention mechanism
Received: 2017-12-22
Revised: 2018-02-09

Semantic relation extraction model via attention based neural Turing machine
ZHANG Runyan, MENG Fanrong, ZHOU Yong, LIU Bing. Semantic relation extraction model via attention based neural Turing machine[J]. Journal of Computer Applications, 2018, 38(7): 1831-1838. DOI: 10.11772/j.issn.1001-9081.2017123009
Authors:ZHANG Runyan  MENG Fanrong  ZHOU Yong  LIU Bing
Affiliation: 1. School of Computer Science and Technology, China University of Mining and Technology, Xuzhou, Jiangsu 221116, China; 2. Institute of Electronics, Chinese Academy of Sciences, Beijing 100080, China
Abstract: Focusing on the poor memory over long sentences and the weak influence of core words in semantic relation extraction, an Attention-based bidirectional Neural Turing Machine (Ab-NTM) model was proposed. First, a Neural Turing Machine (NTM) was used instead of a Recurrent Neural Network (RNN), with a Long Short-Term Memory (LSTM) network acting as its controller; its larger, non-interfering storage allows the model to hold longer memories than an RNN. Second, an attention layer was used to organize context information at the word level, so that the model could focus on the core words in a sentence. Finally, the relation labels were obtained through a classifier. Experiments on the SemEval-2010 Task 8 dataset show that the proposed model outperforms most state-of-the-art methods with an 86.2% F1-score.
Keywords:Natural Language Processing (NLP)  semantic relation extraction  Recurrent Neural Network (RNN)  bidirectional Neural Turing Machine (NTM)  attention mechanism  
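The following is a minimal sketch (not the authors' released code) of the word-level attention and classification stages described in the abstract, written in PyTorch. A plain bidirectional LSTM stands in for the bidirectional NTM encoder with its LSTM controller, and all layer names, hyper-parameters and the vocabulary size are illustrative assumptions; only the 19-class label set comes from the SemEval-2010 Task 8 task definition.

# Minimal sketch, assuming PyTorch: a bidirectional LSTM (standing in for the
# bidirectional NTM of Ab-NTM), word-level attention over its hidden states,
# and a softmax relation classifier. All names and sizes are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

class WordAttentionRelationClassifier(nn.Module):
    def __init__(self, vocab_size, embed_dim=100, hidden_dim=100, num_relations=19):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # Bidirectional recurrent encoder; the paper replaces this with an NTM
        # controlled by an LSTM, which is not reproduced here.
        self.encoder = nn.LSTM(embed_dim, hidden_dim,
                               batch_first=True, bidirectional=True)
        # Word-level attention: a learned query scores every time step.
        self.attn_query = nn.Linear(2 * hidden_dim, 1, bias=False)
        self.classifier = nn.Linear(2 * hidden_dim, num_relations)

    def forward(self, token_ids):                        # (batch, seq_len)
        embedded = self.embedding(token_ids)              # (batch, seq_len, embed)
        states, _ = self.encoder(embedded)                # (batch, seq_len, 2*hidden)
        scores = self.attn_query(states).squeeze(-1)      # (batch, seq_len)
        weights = F.softmax(scores, dim=-1)               # attention over words
        sentence = torch.bmm(weights.unsqueeze(1), states).squeeze(1)
        return self.classifier(sentence)                  # relation logits

# Usage with dummy data; SemEval-2010 Task 8 has 19 labels when relation
# direction is counted (9 directed relations x 2, plus "Other").
model = WordAttentionRelationClassifier(vocab_size=5000)
logits = model(torch.randint(0, 5000, (2, 30)))           # 2 sentences, 30 tokens
predicted_relation = logits.argmax(dim=-1)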