     

Sentiment Analysis Based on a Hybrid Model of ELMo and Transformer
Citation: ZHAO Yaou, ZHANG Jiachong, LI Yibin, WANG Yukui. Sentiment Analysis Based on Hybrid Model of ELMo and Transformer[J]. Journal of Chinese Information Processing, 2021, 35(3): 115-124
Authors: ZHAO Yaou, ZHANG Jiachong, LI Yibin, WANG Yukui
Affiliations: 1. Inspur Financial Information Technology Company Limited, Jinan, Shandong 250101, China;
2. School of Information Science and Engineering, University of Jinan, Jinan, Shandong 250022, China;
3. School of Control Science and Engineering, Shandong University, Jinan, Shandong 250061, China
Funding: National Key R&D Program of China, Cloud Computing and Big Data Key Special Project (2016YFB1001100, 2016YFB1001104); National Natural Science Foundation of China Youth Program (61702218)
Abstract: To address the inability of recurrent neural network models to directly extract the bidirectional semantic features of a sentence, and the failure of traditional word-embedding methods to effectively represent polysemous words, this paper proposes a hybrid model based on ELMo and Transformer for sentiment classification. First, the model uses ELMo to generate word vectors. Built on a bidirectional LSTM, ELMo further fuses the contextual features of the surrounding sentence into each word vector, and can generate di…

Keywords: sentiment analysis; ELMo model; Transformer model; multi-head self-attention mechanism; natural language processing
Received: 2019-12-03

Sentiment Analysis Based on Hybrid Model of ELMo and Transformer
ZHAO Yaou, ZHANG Jiachong, LI Yibin, WANG Yukui. Sentiment Analysis Based on Hybrid Model of ELMo and Transformer[J]. Journal of Chinese Information Processing, 2021, 35(3): 115-124
Authors: ZHAO Yaou, ZHANG Jiachong, LI Yibin, WANG Yukui
Affiliations: 1. Inspur Financial Information Technology Company Limited, Jinan, Shandong 250101, China; 2. School of Information Science and Engineering, University of Jinan, Jinan, Shandong 250022, China; 3. School of Control Science and Engineering, Shandong University, Jinan, Shandong 250061, China
Abstract: A hybrid model based on ELMo (Embeddings from Language Models) and Transformer is proposed for sentiment analysis. First, the ELMo model, built on a bidirectional LSTM, is applied to generate word vectors that combine context features with word features, producing different vectors for the different senses of a polysemous word. Then, the ELMo vectors are fed into a Transformer whose encoder and decoder are modified for sentiment classification. The hybrid of ELMo and Transformer, two networks with different structures, extracts the semantic features of sentences from different aspects. Experimental results show that, compared with state-of-the-art methods, the proposed model improves accuracy by 3.52% on the NLPCC2014 Task 2 dataset, and by 0.7%, 2%, 1.98% and 1.36% on the four hotel-review sub-datasets, respectively.
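The multi-head self-attention mechanism listed in the keywords is the core operation of the Transformer branch described above. As a rough illustration only (not the authors' implementation; the dimensions, weight initialization, and function names here are hypothetical), a minimal NumPy sketch of scaled dot-product self-attention with multiple heads might look like this:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_self_attention(X, Wq, Wk, Wv, Wo, n_heads):
    """X: (seq_len, d_model) word vectors (e.g. ELMo embeddings projected
    to d_model). Wq, Wk, Wv, Wo: (d_model, d_model) projection matrices."""
    seq_len, d_model = X.shape
    d_head = d_model // n_heads
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    # Split each projection into heads: (n_heads, seq_len, d_head).
    split = lambda M: M.reshape(seq_len, n_heads, d_head).transpose(1, 0, 2)
    Qh, Kh, Vh = split(Q), split(K), split(V)
    # Scaled dot-product attention per head: (n_heads, seq_len, seq_len).
    scores = Qh @ Kh.transpose(0, 2, 1) / np.sqrt(d_head)
    attn = softmax(scores, axis=-1)
    # Weighted sum of values, then merge heads back to (seq_len, d_model).
    out = (attn @ Vh).transpose(1, 0, 2).reshape(seq_len, d_model)
    return out @ Wo

rng = np.random.default_rng(0)
d_model, seq_len, n_heads = 16, 5, 4
X = rng.standard_normal((seq_len, d_model))
Ws = [rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)
      for _ in range(4)]
Y = multi_head_self_attention(X, *Ws, n_heads=n_heads)
print(Y.shape)  # (5, 16)
```

Each head attends over the whole sequence at once, which is what lets the Transformer branch capture bidirectional sentence features without the step-by-step recurrence of an LSTM.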
Keywords: sentiment analysis; embeddings from language models; Transformer model; multi-head self-attention mechanism; natural language processing
