Neural Attentional Relation Extraction with Dual Dependency Trees
Authors: Dong Li, Zhi-Lei Lei, Bao-Yan Song, Wan-Ting Ji, Yue Kou
Affiliation: 1. School of Information, Liaoning University, Shenyang 110036, China; 2. School of Computer Science and Engineering, Northeastern University, Shenyang 110004, China
Abstract: Relation extraction is widely used to identify semantic relations between entities in plain text. Dependency trees provide deeper semantic information for relation extraction. However, existing dependency-tree-based models adopt pruning strategies that are either too aggressive or too conservative, leading to insufficient semantic information or excessive noise. To overcome this issue, we propose the Neural Attentional Relation Extraction Model with Dual Dependency Trees (DDT-REM), which leverages both the syntactic dependency tree and the semantic dependency tree to capture syntactic and semantic features, respectively. Specifically, we first propose a novel representation learning scheme to capture dependency relations from both syntax and semantics. Second, for the syntactic dependency tree, we propose a local-global attention mechanism to compensate for semantic deficits. We also design an extension of graph convolutional networks (GCNs) to perform relation extraction, which effectively improves extraction accuracy. We conduct experimental studies on three real-world datasets. Compared with traditional methods, our method improves F1 scores by 0.3, 0.1, and 1.6 on the three datasets, respectively.
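The model described above builds on graph convolutional networks applied to dependency trees: each token is a node, dependency arcs are edges, and a GCN layer aggregates each token's neighborhood before classification. As a minimal illustrative sketch (not the authors' DDT-REM architecture; the adjacency matrix, features, and weights below are toy values), a single GCN layer over a dependency-tree adjacency matrix might look like:

```python
import numpy as np

def gcn_layer(adj, feats, weight):
    """One graph-convolution step over a dependency tree:
    add self-loops, row-normalize the adjacency matrix, then
    aggregate neighbor features and apply a linear map + ReLU."""
    a_hat = adj + np.eye(adj.shape[0])        # self-loops so each node keeps its own features
    deg = a_hat.sum(axis=1, keepdims=True)    # node degrees (including self-loop)
    a_norm = a_hat / deg                      # row normalization
    return np.maximum(a_norm @ feats @ weight, 0.0)

# Toy 3-token "sentence" whose (undirected) dependency edges are 0-1 and 1-2
adj = np.array([[0., 1., 0.],
                [1., 0., 1.],
                [0., 1., 0.]])
feats = np.eye(3)                 # one-hot token features
weight = np.full((3, 2), 0.5)     # toy weight matrix
out = gcn_layer(adj, feats, weight)
print(out.shape)  # (3, 2): one hidden vector per token
```

In a full relation extraction model, the token representations produced by stacked layers like this would be pooled over the entity spans and fed to a relation classifier; DDT-REM additionally runs such convolutions over both the syntactic and the semantic dependency tree and fuses them with attention.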
Keywords: relation extraction; graph convolutional network (GCN); syntactic dependency tree; semantic dependency tree
Published in Journal of Computer Science and Technology (《计算机科学技术学报》).