Dual-channel word vectors based ACRNN for text classification

Citation: Xing Xin, Sun Guozi. Dual-channel word vectors based ACRNN for text classification[J]. Application Research of Computers, 2021, 38(4): 1033-1037. DOI: 10.19734/j.issn.1001-3695.2020.05.0127
Authors: Xing Xin, Sun Guozi
Affiliation: School of Computer Science, Nanjing University of Posts & Telecommunications, Nanjing 210023, China
Funding: Open Project of the Key Laboratory of Information Network Security, Ministry of Public Security; National Natural Science Foundation of China; Open Fund of the State Key Laboratory of Mathematical Engineering and Advanced Computing
Received: 2020-05-26
Revised: 2021-03-09

Abstract: Common text classification models are built by stacking recurrent neural network and convolutional neural network structures. Although such stacked structures can extract higher-dimensional, deeper semantic information, part of the effective feature information is lost where the different structures connect. To address this problem, this paper proposes a classification model based on dual-channel word vectors, which uses an attention-based Bi-LSTM together with a CNN in a shallower structure to extract features from the text representation effectively. In addition, the paper presents a new method that represents the text in two forms, forward and backward, and applies a CNN to extract feature information from each. Classification experiments on two different five-class datasets, with comparisons against a variety of benchmark models, verify that the model is effective and that it outperforms models with stacked structures.

Keywords: text classification; bidirectional long short-term memory network (Bi-LSTM); convolutional neural network (CNN); attention mechanism; dual-channel; word vector
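As a rough illustration only (not the authors' code, which is not reproduced on this page), the attention step the abstract describes over Bi-LSTM hidden states amounts to softmax-weighted pooling of the per-timestep vectors; all names, shapes, and values below are assumptions for the sketch:

```python
import math

def attention_pool(H, w):
    """Attention pooling: weight each hidden state in H (list of T vectors of
    length d) by the softmax of its dot product with a learned query vector w,
    then return the weighted sum and the attention weights."""
    scores = [sum(h_i * w_i for h_i, w_i in zip(h, w)) for h in H]
    m = max(scores)                              # subtract max for stability
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    alphas = [e / z for e in exps]               # softmax attention weights
    d = len(H[0])
    pooled = [sum(alphas[t] * H[t][j] for t in range(len(H))) for j in range(d)]
    return pooled, alphas

# Toy example: T=3 hidden states of size d=2, and a toy query vector.
H = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
w = [1.0, 1.0]
v, alphas = attention_pool(H, w)
print(round(sum(alphas), 6))  # 1.0 — attention weights form a distribution
```

In the full model this pooled vector would come from Bi-LSTM outputs and feed the classifier alongside the CNN channel; here the inputs are hand-written toy values purely to show the mechanism.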
Indexed in: VIP (Weipu), Wanfang Data, and other databases.