Text Classification Based on Phrase Attention Mechanism
JIANG Wei, JIN Zhong. Text Classification Based on Phrase Attention Mechanism [J]. Journal of Chinese Information Processing, 2018, 32(2): 102.
Authors: JIANG Wei  JIN Zhong
Affiliation: 1. School of Computer Science and Engineering, Nanjing University of Science & Technology, Nanjing, Jiangsu 210094, China;
2. MOE Key Laboratory of Intelligent Perception and System for High-Dimensional Information, Nanjing University of Science & Technology, Nanjing, Jiangsu 210094, China
Funding: National Natural Science Foundation of China (61373063, 61375007, 61233011, 91420201, 61472187); National Key Basic Research Program of China (2014CB349303)

Abstract: Bidirectional recurrent neural networks with word-level attention have a weakness in text classification: generating the text representation directly from weighted words loses a great deal of information, which makes the network hard to train on small datasets. Moreover, words acquire clear semantics only when combined with their context into phrases, and the meaning of a text is often determined by a few key phrases, so a text representation composed by learning phrase weights can be more accurate than one composed by learning word weights. This paper therefore proposes NN-PA, a neural network framework based on a phrase-level attention mechanism: a convolutional layer inserted after the word embedding layer extracts representations of N-gram phrases, and a bidirectional recurrent neural network with attention then learns the text representation. Five attention mechanisms are examined. Experiments show that the NN-PA models based on the different attention mechanisms not only clearly improve classification accuracy on both large and small datasets but also converge faster. In particular, NN-PA1 and NN-PA2 outperform mainstream deep learning models, and NN-PA2 achieves 53.35% accuracy on the five-class task of the Stanford Sentiment Treebank, the best result reported to date.

Keywords: text classification  recurrent neural network  convolutional layer  attention mechanism
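The pipeline the abstract describes, a convolutional layer over word embeddings that produces N-gram phrase representations, followed by attention-weighted pooling into a text vector, can be sketched in NumPy. This is an illustrative toy, not the authors' implementation: all dimensions, the dot-product attention variant, and the omission of the bidirectional RNN stage (which in NN-PA sits between the convolution and the attention) are simplifying assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def phrase_attention(embeddings, conv_w, conv_b, context):
    """Toy sketch of phrase-level attention.

    embeddings: (T, d)    word embeddings for a text of T words
    conv_w:     (n, d, h) convolution filters over n-gram windows
    conv_b:     (h,)      filter biases
    context:    (h,)      attention context vector
    Returns an (h,)-dimensional text representation.
    """
    T, d = embeddings.shape
    n, _, h = conv_w.shape
    # Convolution: each window of n consecutive words -> one phrase vector
    phrases = []
    for t in range(T - n + 1):
        window = embeddings[t:t + n]                         # (n, d)
        p = np.tanh(np.einsum('nd,ndh->h', window, conv_w) + conv_b)
        phrases.append(p)
    phrases = np.stack(phrases)                              # (T-n+1, h)
    # Dot-product attention: softmax over phrase scores
    scores = phrases @ context                               # (T-n+1,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # Text representation = attention-weighted sum of phrase vectors
    return weights @ phrases                                 # (h,)

# Example: 6 words, 50-dim embeddings, trigram phrases, 32-dim phrase space
emb = rng.standard_normal((6, 50))
w = rng.standard_normal((3, 50, 32)) * 0.1
b = np.zeros(32)
ctx = rng.standard_normal(32)
text_vec = phrase_attention(emb, w, b, ctx)
print(text_vec.shape)  # (32,)
```

The key difference from word-level attention is that the softmax weights are learned over trigram windows rather than single words, so each attended unit already carries local context before pooling.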