

Implicit Sentiment Analysis Based on RoBERTa Fused with BiLSTM and Attention Mechanism
ZHANG Jun,ZHANG Li,SHEN Fanfan,TAN Hai,HE Yanxiang.Implicit Sentiment Analysis Based on RoBERTa Fused with BiLSTM and Attention Mechanism[J].Computer Engineering and Applications,2022,58(23):142-150.
Authors:ZHANG Jun  ZHANG Li  SHEN Fanfan  TAN Hai  HE Yanxiang
Affiliation:1.College of Information Engineering, East China University of Technology, Nanchang 330013, China 2.Radiological Geosciences Big Data Engineering Laboratory of Jiangxi Province, East China University of Technology, Nanchang 330013, China 3.School of Computer Science, Wuhan University, Wuhan 430072, China 4.School of Information Engineering, Nanjing Audit University, Nanjing 211815, China
Abstract:Implicit sentiment analysis is a research hotspot in natural language processing. Because implicit sentiment is expressed obliquely and lacks explicit sentiment words, traditional text sentiment analysis methods are no longer applicable. To address the difficulty of capturing the hidden sentiment in sentence semantics, this paper proposes RBLA, a model that fuses the pre-trained RoBERTa model with a bidirectional long short-term memory (BiLSTM) network and an attention mechanism. RBLA uses RoBERTa to capture the semantic features of characters and words in implicit sentiment sentences, then uses a BiLSTM to learn the forward and backward semantic information of each sentence, capturing inter-sentence dependencies and extracting deep-level text features. An attention mechanism computes sentiment weights, which a softmax function normalizes to produce the implicit sentiment classification result. Experimental results show that, compared with several existing typical implicit sentiment classification models, RBLA achieves better precision, recall, and F1 score.
Keywords:natural language processing  implicit sentiment analysis  RoBERTa  attention mechanism  bi-directional long short-term memory (BiLSTM)
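The pipeline described in the abstract (RoBERTa token embeddings → BiLSTM → attention-weighted pooling → softmax classifier) can be sketched in PyTorch. This is an illustrative reconstruction, not the authors' code: the RoBERTa encoder is stubbed out as a precomputed hidden-state tensor, and the hidden size, number of classes, and additive attention form are assumptions.

```python
import torch
import torch.nn as nn

class RBLA(nn.Module):
    """Sketch of the RBLA pipeline from the abstract.

    The RoBERTa encoder itself is omitted; its last hidden states
    (shape [batch, seq_len, emb_dim]) are assumed to be supplied as
    input. Hidden size and class count are illustrative placeholders.
    """

    def __init__(self, emb_dim: int = 768, hidden: int = 128, num_classes: int = 3):
        super().__init__()
        # BiLSTM reads the token sequence in both directions,
        # producing 2*hidden features per position.
        self.bilstm = nn.LSTM(emb_dim, hidden, batch_first=True, bidirectional=True)
        # Attention scores one weight per token position (assumed form).
        self.attn = nn.Linear(2 * hidden, 1)
        self.classifier = nn.Linear(2 * hidden, num_classes)

    def forward(self, roberta_hidden_states: torch.Tensor) -> torch.Tensor:
        h, _ = self.bilstm(roberta_hidden_states)        # [B, T, 2*hidden]
        weights = torch.softmax(self.attn(h), dim=1)     # [B, T, 1] sentiment weights
        context = (weights * h).sum(dim=1)               # attention-weighted pooling
        return torch.softmax(self.classifier(context), dim=-1)  # class probabilities

# Usage with dummy "RoBERTa" embeddings: batch of 2 sentences, 16 tokens each.
probs = RBLA()(torch.randn(2, 16, 768))  # shape [2, 3], each row sums to 1
```

The attention layer here replaces plain last-state pooling: instead of classifying from the final BiLSTM state, every token contributes in proportion to its learned sentiment weight, which is how the abstract describes capturing sentiment that is spread implicitly across the sentence.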