
Multi-head Attention Pooling-Based RCNN Model for Text Classification
Cite this article: ZHAI Yiming, WANG Binjun, ZHOU Zhining, TONG Xin. Multi-head Attention Pooling-Based RCNN Model for Text Classification[J]. Computer Engineering and Applications, 2021, 57(12): 155-160. DOI: 10.3778/j.issn.1002-8331.2003-0276
Authors: ZHAI Yiming  WANG Binjun  ZHOU Zhining  TONG Xin
Affiliation: College of Police Information Engineering and Cyber Security, People's Public Security University of China, Beijing 100038, China
Abstract: The max-pooling strategy used in the pooling layer of the classic Recurrent Convolutional Neural Network (RCNN) is overly simple: it keeps only the most prominent feature and discards all others, which hurts classification accuracy. To address this, a Multi-Head Attention Pooling-based Recurrent Convolutional Neural Network (MHAP-RCNN) model is proposed. Multi-head attention pooling fully accounts for each feature's contribution to classification and can be dynamically optimized during training, effectively alleviating the one-sidedness of max pooling. Experiments on three public text classification datasets show that the proposed model outperforms the classic RCNN and other models.

Keywords: text classification  recurrent convolutional neural network  pooling  max pooling  multi-head attention pooling

Multi-head Attention Pooling-Based RCNN Model for Text Classification
ZHAI Yiming, WANG Binjun, ZHOU Zhining, TONG Xin. Multi-head Attention Pooling-Based RCNN Model for Text Classification[J]. Computer Engineering and Applications, 2021, 57(12): 155-160. DOI: 10.3778/j.issn.1002-8331.2003-0276
Authors:ZHAI Yiming  WANG Binjun  ZHOU Zhining  TONG Xin
Affiliation:College of Police Information Engineering and Cyber Security, People’s Public Security University of China, Beijing 100038, China
Abstract: The max-pooling strategy adopted in the pooling layer of the classic Recurrent Convolutional Neural Network (RCNN) is overly simple: it retains only the single most prominent feature and ignores all others, degrading classification accuracy. To address this, a Multi-Head Attention Pooling-based Recurrent Convolutional Neural Network (MHAP-RCNN) is proposed. Multi-head attention pooling fully considers each feature's contribution to classification and can be dynamically optimized during training, effectively alleviating this limitation of max pooling. Experiments on three public text classification datasets show that the proposed model outperforms the classic RCNN and other models on text classification.
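The abstract does not give the paper's exact formulation, but the contrast it draws can be illustrated with a common form of multi-head attention pooling: each of k heads learns a scoring vector, scores are softmax-normalized over timesteps, and each head's weighted sum of the feature sequence is concatenated. The following NumPy sketch is an illustrative assumption, not the authors' code; all names and shapes are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def max_pool(H):
    """Classic max pooling: keep only the per-dimension maximum.
    H: (T, d) sequence of T feature vectors -> (d,)"""
    return H.max(axis=0)

def multi_head_attention_pool(H, W):
    """Multi-head attention pooling (one scoring vector per head).
    H: (T, d) feature sequence; W: (d, k) learnable scoring vectors.
    Returns the concatenation of k attention-weighted sums: (k*d,)."""
    scores = H @ W                    # (T, k): relevance of each timestep per head
    alpha = softmax(scores, axis=0)   # attention distribution over timesteps
    pooled = alpha.T @ H              # (k, d): weighted sum of features per head
    return pooled.reshape(-1)         # concatenate heads

rng = np.random.default_rng(0)
H = rng.normal(size=(10, 8))   # 10 timesteps, 8-dim RCNN features
W = rng.normal(size=(8, 4))    # 4 heads
v = multi_head_attention_pool(H, W)
print(v.shape)  # (32,)
```

Unlike `max_pool`, which discards everything but the per-dimension maximum, every timestep contributes to `v` in proportion to its attention weight, and `W` is trained jointly with the classifier.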
Keywords:text classification  recurrent convolutional neural network  pooling  max pooling  multi-head attention pooling  
This article is indexed by Wanfang Data and other databases.
