Syntax-enhanced semantic parsing with syntax-aware representation
Citation: XIE Defeng, JI Jianmin. Syntax-enhanced semantic parsing with syntax-aware representation [J]. Journal of Computer Applications, 2021, 41(9): 2489-2495.
Authors: XIE Defeng, JI Jianmin
Affiliation: School of Computer Science and Technology, University of Science and Technology of China, Hefei, Anhui 230027, China
Funding: Science and Technology Innovation 2030 "New Generation Artificial Intelligence" Major Project (2018AA000500); Science and Technology Planning Project of Guangdong Province (2017B010110011)
Abstract: Syntactic information, that is, the syntactic structure or dependency relations between the words of a complete sentence, is an important and effective source of reference in Natural Language Processing (NLP). The task of semantic parsing is to transform natural language sentences directly into semantically complete, computer-executable languages. Previous semantic parsing studies have rarely used syntactic information from the input source to improve end-to-end semantic parsing. To further improve the accuracy and efficiency of end-to-end semantic parsing models, a semantic parsing method was proposed that exploits source-side syntactic dependency information. The basic idea is as follows: an end-to-end dependency parser was pre-trained first; then the intermediate representation of this parser was used as a syntax-aware representation and concatenated with the original word embeddings to produce a new input embedding, which was fed into the end-to-end semantic parsing model; finally, model fusion was performed by transductive fusion learning. In the experiments, the proposed model was compared with the baseline Transformer and with related work from the past decade. Experimental results show that, on the ATIS, GEO and JOBS datasets, the semantic parsing model that integrates the dependency syntax-aware representation and transductive fusion learning achieves the best accuracies of 89.1%, 90.7% and 91.4% respectively, outperforming the Transformer and verifying the effectiveness of introducing syntactic dependency information.
Keywords: syntax-aware; semantic parsing; deep learning; Natural Language Processing (NLP); language model
Received: 2020-11-27
Revised: 2021-01-15
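
As a rough illustration of the core step described in the abstract, namely concatenating a pre-trained dependency parser's intermediate representation with the ordinary word embeddings to form a new input embedding, the following PyTorch sketch may help. The module and variable names, dimensions, and the choice to keep the parser frozen are assumptions made for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn

class SyntaxAwareEmbedding(nn.Module):
    """Fuses word embeddings with a syntax-aware representation (illustrative sketch)."""

    def __init__(self, vocab_size: int, word_dim: int, syntax_dim: int, model_dim: int):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, word_dim)
        # Project the concatenated [word ; syntax] vector to the dimension
        # expected by the downstream Transformer-based semantic parser.
        self.proj = nn.Linear(word_dim + syntax_dim, model_dim)

    def forward(self, token_ids: torch.Tensor, syntax_repr: torch.Tensor) -> torch.Tensor:
        # token_ids:   (batch, seq_len) source-side token indices
        # syntax_repr: (batch, seq_len, syntax_dim) intermediate states taken from a
        #              dependency parser that was pre-trained beforehand (assumed frozen here)
        word_repr = self.word_emb(token_ids)                 # (batch, seq_len, word_dim)
        fused = torch.cat([word_repr, syntax_repr], dim=-1)  # splice the two representations
        return self.proj(fused)                              # new input embedding

# Usage sketch (sizes are illustrative):
# embedder = SyntaxAwareEmbedding(vocab_size=10_000, word_dim=256, syntax_dim=256, model_dim=512)
# fused_inputs = embedder(token_ids, parser_hidden_states)
# outputs = semantic_parser(inputs_embeds=fused_inputs)  # e.g. a Transformer encoder-decoder
```

On this reading, any sequence-to-sequence semantic parser (the paper's baseline is a Transformer) can consume the fused embeddings in place of its usual word embeddings.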