Funding: National Natural Science Foundation of China; Major Industry-University-Research Special Project of Guangdong Province

Natural Language Processing Model Based on One-Dimensional Dilated Convolution and Attention Mechanism
LIAO Wenxiong,ZENG Bi,XU Yayun. Natural Language Processing Model Based on One-Dimensional Dilated Convolution and Attention Mechanism[J]. Computer Engineering and Applications, 2021, 57(4): 114-119. DOI: 10.3778/j.issn.1002-8331.1912-0057
Authors:LIAO Wenxiong  ZENG Bi  XU Yayun
Affiliation:School of Computers, Guangdong University of Technology, Guangzhou 510006, China
Abstract:Natural language processing, as a branch of artificial intelligence, has a wide range of applications in daily life. With the application of recurrent neural networks to natural language processing and their continuous evolution and iteration, the field has made a great leap. As a result, recurrent neural networks have quickly become the mainstream algorithms in natural language processing, but they suffer from complex structure and long training time. This paper proposes a natural language processing model based on one-dimensional dilated convolution and the Attention mechanism. First, one-dimensional dilated convolution extracts deep features from the text; then the Attention mechanism assigns weights to these deep features to integrate the features across time steps. Experimental results show that the model requires only about 30% of the training time of a recurrent neural network while achieving comparable performance, which verifies the effectiveness of the proposed model.
Keywords:dilated convolution  Attention mechanism  natural language processing  
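The pipeline the abstract describes — a one-dimensional dilated convolution to extract deep features, followed by an Attention mechanism that weights and integrates the per-time-step features — can be sketched in NumPy. The function names, tensor shapes, and the single-query attention pooling below are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def dilated_conv1d(x, w, dilation=1):
    # Causal 1D dilated convolution over a sequence.
    # x: (T, C_in) input sequence, w: (K, C_in, C_out) kernel.
    # Left-pads with zeros so the output keeps length T;
    # tap k looks back k * dilation time steps.
    K, C_in, C_out = w.shape
    T = x.shape[0]
    pad = (K - 1) * dilation
    xp = np.concatenate([np.zeros((pad, C_in)), x], axis=0)
    out = np.zeros((T, C_out))
    for t in range(T):
        for k in range(K):
            out[t] += xp[t + pad - k * dilation] @ w[K - 1 - k]
    return out

def attention_pool(h, v):
    # Score each time step's feature vector against a query vector v,
    # softmax the scores into weights, and return the weighted sum.
    scores = h @ v                     # (T,)
    a = np.exp(scores - scores.max())  # numerically stable softmax
    a /= a.sum()
    return a @ h                       # (C,) attention-weighted summary
```

Stacking such convolutions with growing dilation rates (1, 2, 4, ...) widens the receptive field exponentially with depth, which is how dilated convolutions cover long contexts without the sequential dependency that makes recurrent networks slow to train.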
This article is indexed in databases including Wanfang Data.