Funding: National Natural Science Foundation of China (61703277); National Natural Science Foundation of China (61605114)
Received: 2019-07-04

Sentiment Analysis Method of Financial Text Based on Transformer Encoder
Citation: LI Fupeng, FU Dongxiang. Sentiment Analysis Method of Financial Text Based on Transformer Encoder[J]. Electronic Science and Technology, 2020, 33(9): 10-15.
Authors:LI Fupeng  FU Dongxiang
Affiliation: School of Optical-Electrical and Computer Engineering, University of Shanghai for Science and Technology, Shanghai 200093, China
Abstract: Sentiment analysis plays an important role in many fields, but most existing studies focus on commodity reviews and microblogs, and financial texts have received comparatively little attention. To address this problem, this paper presents a financial text sentiment analysis method based on the Transformer encoder. The Transformer encoder is a feature extraction unit built on the self-attention mechanism: when it processes a text sequence, it can relate any two words in a sentence regardless of their distance, which overcomes the problem of long-range dependencies. The multi-head attention mechanism computes attention over the same sentence several times to capture more of the semantic features implied in the context. Experiments are carried out on a balanced corpus built from financial news, and the proposed model is compared with models based on convolutional and recurrent neural networks. The experimental results show that the Transformer encoder based method achieves the best performance on financial text sentiment analysis.
Keywords:sentiment analysis  finance  self-attention mechanism  transformer encoder  scaled dot-product attention  multi-head attention  
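The mechanisms named in the abstract and keywords can be sketched in a few lines. The following is a minimal NumPy illustration of scaled dot-product attention and multi-head attention, not the authors' implementation: the random projection matrices stand in for learned weights W_Q, W_K, W_V, and all dimensions are arbitrary choices for the example.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.transpose(0, 2, 1) / np.sqrt(d_k)   # (batch, seq, seq)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)     # softmax over the key axis
    return weights @ V                                 # (batch, seq, d_k)

def multi_head_attention(X, num_heads, rng):
    """Split d_model across heads, attend in each head, concatenate the results."""
    batch, seq, d_model = X.shape
    assert d_model % num_heads == 0
    d_k = d_model // num_heads
    out_heads = []
    for _ in range(num_heads):
        # Random projections stand in for the learned matrices W_Q, W_K, W_V.
        W_q, W_k, W_v = (rng.standard_normal((d_model, d_k)) / np.sqrt(d_model)
                         for _ in range(3))
        out_heads.append(scaled_dot_product_attention(X @ W_q, X @ W_k, X @ W_v))
    return np.concatenate(out_heads, axis=-1)          # back to (batch, seq, d_model)

rng = np.random.default_rng(0)
X = rng.standard_normal((2, 5, 16))   # 2 sentences, 5 tokens each, d_model = 16
out = multi_head_attention(X, num_heads=4, rng=rng)
print(out.shape)  # (2, 5, 16)
```

Because the attention weights for each token are computed against every other token in the sentence, the pairing of two words is independent of how far apart they are, which is the long-range dependency property the abstract refers to.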