
Multi-head attention memory network for short text sentiment classification
Citation: DENG Yu, LI Xiaoyu, CUI Jian, LIU Qi. Multi-head attention memory network for short text sentiment classification[J]. Journal of Computer Applications, 2021, 41(11): 3132-3138.
Authors: DENG Yu  LI Xiaoyu  CUI Jian  LIU Qi
Affiliation: School of Information and Software Engineering, University of Electronic Science and Technology of China, Chengdu Sichuan 610054, China
Unit 93246 of PLA, Changchun Jilin 130000, China
Unit 95486 of PLA, Chengdu Sichuan 610041, China
Fund project: Science and Technology Program of Sichuan Province (Key Research and Development Project) (19ZDYF0794)
Received: 2021-01-11
Revised: 2021-03-03

Abstract: With the development of social networks, sentiment analysis of the massive texts they contain has important social value. Unlike ordinary text classification, short text sentiment classification must mine implicit sentiment semantic features, which makes it highly difficult and challenging. To obtain short text sentiment semantic features at a higher level, a Multi-head Attention Memory Network (MAMN) was proposed for short text sentiment classification. First, n-gram feature information and an Ordered Neurons Long Short-Term Memory (ON-LSTM) network were used to improve the multi-head self-attention mechanism, so that the internal relations of the text context were fully extracted and the model was able to obtain richer text feature information. Then, the multi-head attention mechanism was adopted to optimize the multi-hop memory network structure, extending the depth of the model while mining higher-level contextual sentiment semantic relations. Extensive experiments were carried out on the Movie Review (MR), Stanford Sentiment Treebank (SST)-1 and SST-2 datasets. The experimental results show that, compared with baseline models based on Recurrent Neural Network (RNN) and Convolutional Neural Network (CNN) structures as well as some recent works, the proposed MAMN achieves better classification results, verifying the important role of the multi-hop structure in performance improvement.
Keywords: short text  sentiment classification  sentiment semantic feature  multi-head attention  memory network
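The two mechanisms named in the abstract, multi-head self-attention and a multi-hop memory structure that repeatedly re-attends over the text representation, can be sketched schematically. The following is a minimal illustrative NumPy sketch, not the paper's MAMN: the n-gram convolution and ON-LSTM components are omitted, and all weight matrices and function names are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_self_attention(X, Wq, Wk, Wv, n_heads):
    """Split the model dimension into n_heads, attend per head, concatenate.
    X: (T, d) token representations; Wq/Wk/Wv: (d, d) projection weights."""
    T, d = X.shape
    dh = d // n_heads
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    outs = []
    for h in range(n_heads):
        q = Q[:, h * dh:(h + 1) * dh]
        k = K[:, h * dh:(h + 1) * dh]
        v = V[:, h * dh:(h + 1) * dh]
        scores = softmax(q @ k.T / np.sqrt(dh))  # (T, T) attention weights
        outs.append(scores @ v)                  # (T, dh) per-head output
    return np.concatenate(outs, axis=-1)         # (T, d)

def multi_hop(X, hop_params, n_heads=4):
    """Stack attention 'hops': each hop re-attends over the running memory,
    with a residual connection, then pools for classification."""
    mem = X
    for Wq, Wk, Wv in hop_params:  # one (hypothetical) parameter set per hop
        mem = mem + multi_head_self_attention(mem, Wq, Wk, Wv, n_heads)
    return mem.mean(axis=0)        # (d,) pooled sentence representation
```

In this sketch, deepening the model means appending another `(Wq, Wk, Wv)` triple to `hop_params`, which mirrors the abstract's point that the multi-hop structure extends model depth while mining higher-level contextual relations.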