
Text Sentiment Classification Based on the CNNCIFG-Attention Model
Cite this article: LI Hui, WANG Yicheng. Text Sentiment Classification Based on the CNNCIFG-Attention Model[J]. Electronic Science and Technology, 2022, 35(2): 46-51.
Authors: LI Hui  WANG Yicheng
Affiliation: 1. School of Physics and Electronic Information Engineering, Henan Polytechnic University, Jiaozuo 454000, China; 2. School of Electrical Engineering and Automation, Henan Polytechnic University, Jiaozuo 454000, China
Abstract: When handling Chinese text sentiment classification tasks, neural networks are weak at extracting salient text features and learn relatively slowly. To address this problem, this paper proposes a hybrid network model based on the attention mechanism. The text corpus is first preprocessed, and a conventional convolutional neural network extracts features from the local information of the sample vectors. These features are then fed into a coupled input and forget gate (CIFG) network to learn the relationships between preceding and following words and sentences. An attention mechanism layer is then added to assign weights to the deep-level text information, strengthening the influence of important information on sentiment classification. Finally, the hybrid model is tested on a crawled collection of JD product reviews, achieving an accuracy of 92.13% and an F-Score of 92.06%, which demonstrates the feasibility of the CNNCIFG-Attention model.
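The attention layer described above assigns a weight to each deep feature produced by the CIFG network. As a point of reference, a common additive-attention formulation (an illustrative assumption here; the paper's exact equations are not reproduced) computes the weights and the weighted sentence vector as

e_t = \tanh(W h_t + b),  \alpha_t = \frac{\exp(u^{\top} e_t)}{\sum_{k} \exp(u^{\top} e_k)},  v = \sum_{t} \alpha_t h_t,

where h_t is the CIFG hidden state at time step t, W, b and u are learned attention parameters, and v is the weighted representation passed to the classifier.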

Keywords: sentiment classification  hybrid network model  convolutional neural network  feature extraction  coupled input and forget gate network  attention mechanism  weight distribution  accuracy  F-Score value
Received: 2020-10-13

CNNCIFG-Attention Model for Text Sentiment Classification
LI Hui, WANG Yicheng. CNNCIFG-Attention Model for Text Sentiment Classification[J]. Electronic Science and Technology, 2022, 35(2): 46-51.
Authors:LI Hui  WANG Yicheng
Affiliation: 1. School of Physics and Electronic Information Engineering, Henan Polytechnic University, Jiaozuo 454000, China; 2. School of Electrical Engineering and Automation, Henan Polytechnic University, Jiaozuo 454000, China
Abstract: Neural networks are weak at extracting salient text features and have a relatively slow learning rate when processing Chinese text sentiment classification tasks. To solve this problem, this study proposes a hybrid network model based on the attention mechanism. The text corpus is first preprocessed, and a traditional convolutional neural network extracts features from the local information of the sample vectors. The extracted features are then fed into the coupled input and forget gate network model to learn the connections between preceding and following words and sentences. Subsequently, an attention mechanism layer is added to assign weights to deep-level text information, increasing the influence of important information on text sentiment classification. Finally, the proposed hybrid network model is tested on a crawled collection of JD product reviews. The test results show that the accuracy of the proposed method reaches 92.13% and the F-Score is 92.06%, which demonstrates the feasibility of the CNNCIFG-Attention model.
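To make the pipeline in the abstract concrete, the following is a minimal PyTorch sketch of a convolutional layer feeding a coupled input and forget gate (CIFG) cell, whose hidden states are pooled by an attention layer before classification. All layer sizes, the single-layer unidirectional CIFG, and the additive attention form are illustrative assumptions; the paper's exact architecture and hyperparameters are not reproduced here.

import torch
import torch.nn as nn
import torch.nn.functional as F

class CIFGCell(nn.Module):
    # LSTM variant with coupled input and forget gates: f_t = 1 - i_t
    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.hidden_size = hidden_size
        # One projection produces the input gate, output gate and candidate state
        self.linear = nn.Linear(input_size + hidden_size, 3 * hidden_size)

    def forward(self, x, state):
        h, c = state
        i, o, g = self.linear(torch.cat([x, h], dim=-1)).chunk(3, dim=-1)
        i, o, g = torch.sigmoid(i), torch.sigmoid(o), torch.tanh(g)
        c = (1.0 - i) * c + i * g          # coupled forget gate: f = 1 - i
        h = o * torch.tanh(c)
        return h, c

class CNNCIFGAttention(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, channels=128,
                 hidden_size=128, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # CNN extracts local n-gram features from the embedded text
        self.conv = nn.Conv1d(embed_dim, channels, kernel_size=3, padding=1)
        self.cifg = CIFGCell(channels, hidden_size)
        # Additive attention over the CIFG hidden states
        self.att_proj = nn.Linear(hidden_size, hidden_size)
        self.att_score = nn.Linear(hidden_size, 1, bias=False)
        self.fc = nn.Linear(hidden_size, num_classes)

    def forward(self, tokens):                       # tokens: (batch, seq_len)
        x = self.embed(tokens)                       # (batch, seq, embed)
        x = F.relu(self.conv(x.transpose(1, 2)))     # (batch, channels, seq)
        x = x.transpose(1, 2)                        # (batch, seq, channels)
        h = x.new_zeros(x.size(0), self.cifg.hidden_size)
        c = torch.zeros_like(h)
        states = []
        for t in range(x.size(1)):                   # run the CIFG cell step by step
            h, c = self.cifg(x[:, t, :], (h, c))
            states.append(h)
        H = torch.stack(states, dim=1)               # (batch, seq, hidden)
        alpha = torch.softmax(self.att_score(torch.tanh(self.att_proj(H))), dim=1)
        context = (alpha * H).sum(dim=1)             # attention-weighted sentence vector
        return self.fc(context)                      # class logits

# Example: score a batch of 8 token sequences of length 50 over a 20 000-word vocabulary
model = CNNCIFGAttention(vocab_size=20000)
logits = model(torch.randint(0, 20000, (8, 50)))     # shape (8, 2)

Coupling the forget gate to the input gate removes one gate's parameters relative to a standard LSTM, which is the usual motivation for the faster training that the abstract attributes to the CIFG component.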
Keywords:sentiment classification  hybrid network model  convolutional neural network  feature extraction  coupled input and forget gate network  attention model  weight distribution  accuracy  F-Score value  