Intention Recognition and Classification Based on BERT-FNN
Cite this article: ZHENG Xin-yue, REN Jun-chao. Intention Recognition and Classification Based on BERT-FNN[J]. Computer and Modernization, 2021, 0(7): 71-76.
Authors: ZHENG Xin-yue, REN Jun-chao
Affiliation: School of Science, Northeastern University, Shenyang 110004, Liaoning, China
Funding: National Natural Science Foundation of China (61673100, 61703083); Fundamental Research Funds for the Central Universities (N150504011)
Abstract: Intention recognition and classification is a prominent problem in natural language processing; in intelligent robots and intelligent customer service, understanding user intent from context is both a key and a difficult task. Traditional approaches rely on rule-based and template-matching (regular-expression) methods or on machine learning, but these suffer from high computational cost and poor generalization. To address these problems, this paper builds on Google's publicly released BERT pre-trained language model to perform context modeling and sentence-level semantic representation of the input text: the vector corresponding to the [CLS] token is taken as the representation of the text's context, and a fully-connected neural network (FNN) then extracts sentence features. To make full use of the data, the idea of problem decomposition is applied to convert the multi-class problem into multiple binary classification problems: each class in turn is treated as the positive example and all remaining classes as negative examples, yielding multiple binary classification tasks and thereby achieving intent classification. Experimental results show that this method outperforms traditional models, achieving an accuracy of 94%.

Keywords: natural language processing; intention recognition; BERT; FNN; decomposition method
Received: 2021-08-02
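The pipeline described in the abstract (BERT [CLS] vector, a fully-connected feature extractor, and one binary head per intent class in a one-vs-rest decomposition) can be sketched as below. This is a minimal illustration assuming the Hugging Face transformers library, a bert-base-chinese checkpoint, and an arbitrary hidden size and class count; none of these details are given in this record, so they are assumptions rather than the authors' exact configuration.

# Minimal sketch of a BERT-[CLS] + FNN intent classifier with one-vs-rest heads.
# Assumes Hugging Face transformers and the "bert-base-chinese" checkpoint;
# hyperparameters (hidden size, dropout, class count) are illustrative only.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer


class BertFNNOneVsRest(nn.Module):
    """BERT encoder + fully-connected network, one binary output per intent class."""

    def __init__(self, num_classes: int, hidden_size: int = 256):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-chinese")
        # Fully-connected feature extractor over the [CLS] vector.
        self.fnn = nn.Sequential(
            nn.Linear(self.bert.config.hidden_size, hidden_size),
            nn.ReLU(),
            nn.Dropout(0.1),
        )
        # One-vs-rest: num_classes independent sigmoid outputs, each treating
        # its own class as positive and all remaining classes as negative.
        self.heads = nn.Linear(hidden_size, num_classes)

    def forward(self, input_ids, attention_mask):
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        cls_vec = out.last_hidden_state[:, 0]        # hidden state at the [CLS] position
        features = self.fnn(cls_vec)
        return torch.sigmoid(self.heads(features))   # per-class positive probability


tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")
model = BertFNNOneVsRest(num_classes=10)             # 10 intent classes is an assumed example
batch = tokenizer(["查询明天的天气"],                 # example utterance: "check tomorrow's weather"
                  padding=True, return_tensors="pt")
probs = model(batch["input_ids"], batch["attention_mask"])
pred = probs.argmax(dim=-1)                          # predicted intent = head with highest score

Under the one-vs-rest decomposition, each head would be trained with a binary cross-entropy loss against a one-hot target, and at inference the predicted intent is simply the head with the highest positive probability.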
