Entity relationship extraction fusing self-attention mechanism and CNN

Citation: YAN Xiong, DUAN Yuexing, ZHANG Zehua. Entity relationship extraction fusing self-attention mechanism and CNN[J]. Computer Engineering & Science, 2020, 42(11): 2059-2066.
Authors: YAN Xiong, DUAN Yuexing, ZHANG Zehua
Affiliation: College of Information and Computer, Taiyuan University of Technology, Jinzhong 030600, Shanxi, China
Received: 2019-09-26
Revised: 2020-03-19

Abstract: Neural network models currently play an important role in entity relationship extraction. A convolutional neural network (CNN) can extract features automatically, but the fixed window size of its convolution kernels limits how much contextual semantic information it captures for the words in a sentence. This paper therefore proposes a relation extraction method that fuses a self-attention mechanism with a CNN: the self-attention mechanism is applied to the original word vectors to capture the dependencies between the words in a sequence, so the input word vectors carry richer semantic information and compensate for the limitations of the CNN's automatically extracted features. Experimental results on the SemEval-2010 Task 8 dataset show that adding the self-attention mechanism improves entity relationship extraction performance.
Keywords: entity relationship extraction; self-attention mechanism; convolutional neural network; word vector; contextual semantics
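The abstract describes the model only at a high level (self-attention over the input word vectors, fused with a fixed-window CNN for relation classification). As an illustration only, the PyTorch-style sketch below shows one plausible realization of that idea; the class name SelfAttentionCNN, the layer sizes, and the fusion-by-concatenation choice are assumptions for the sketch, not the authors' actual implementation.

```python
# Illustrative sketch only: scaled dot-product self-attention over word embeddings,
# fused with a fixed-window CNN for sentence-level relation classification.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SelfAttentionCNN(nn.Module):
    def __init__(self, vocab_size, embed_dim=100, num_filters=150,
                 window_size=3, num_classes=19):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Self-attention projections: each word attends to the whole sentence,
        # enriching its vector with contextual semantic information.
        self.q = nn.Linear(embed_dim, embed_dim)
        self.k = nn.Linear(embed_dim, embed_dim)
        self.v = nn.Linear(embed_dim, embed_dim)
        # Fixed-window convolution over the fused (original + attended) vectors.
        self.conv = nn.Conv1d(2 * embed_dim, num_filters,
                              kernel_size=window_size, padding=window_size // 2)
        self.fc = nn.Linear(num_filters, num_classes)

    def forward(self, token_ids):                      # (batch, seq_len)
        x = self.embed(token_ids)                      # (batch, seq_len, embed_dim)
        q, k, v = self.q(x), self.k(x), self.v(x)
        scores = q @ k.transpose(1, 2) / (x.size(-1) ** 0.5)
        attended = torch.softmax(scores, dim=-1) @ v   # context-enriched vectors
        # Fuse by concatenating original and attended representations.
        fused = torch.cat([x, attended], dim=-1)       # (batch, seq_len, 2*embed_dim)
        feats = F.relu(self.conv(fused.transpose(1, 2)))
        pooled = feats.max(dim=-1).values              # max pooling over positions
        return self.fc(pooled)                         # relation logits


# Toy usage: SemEval-2010 Task 8 has 19 labels (9 directed relations + Other).
model = SelfAttentionCNN(vocab_size=5000)
logits = model(torch.randint(0, 5000, (2, 30)))        # 2 sentences, 30 tokens each
print(logits.shape)                                    # torch.Size([2, 19])
```

Concatenating the attended vectors with the original embeddings is just one way to "fuse" the two representations; summation or feeding the attended vectors alone into the CNN would be equally consistent with the abstract.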
Indexed by: Wanfang Data and other databases.