Foundation item: Natural Science Foundation of Guangdong Province (2020A1515010445)
Received: 2022-02-20; Revised: 2022-03-21

Improved SemBERT-based Feature Reorganization Model
GONG An-Jing, CHEN Hong-Ying. Improved SemBERT-based Feature Reorganization Model [J]. Computer Systems & Applications, 2022, 31(11): 207-214.
Authors: GONG An-Jing, CHEN Hong-Ying
Affiliation:School of Computer, South China Normal University, Guangzhou 510631, China
Abstract: Although the SemBERT model improves on the BERT model, it has two obvious defects. First, its ability to obtain vector representations is limited. Second, it performs task classification directly on conventional features without considering the categories those features belong to. A new feature reorganization network is proposed to address these two defects: it adds a self-attention mechanism inside SemBERT to obtain better vector representations and attaches an external feature reorganization mechanism that reassigns feature weights. Experimental results show that the F1 score of the new method on the Microsoft Research Paraphrase Corpus (MRPC) dataset is one percentage point higher than that of the classical SemBERT model. The proposed model achieves a clear improvement on a small dataset and outperforms most current strong models.
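The two mechanisms named in the abstract can be illustrated with a minimal NumPy sketch: scaled dot-product self-attention over token vectors, and a softmax gate that reassigns weights across feature groups before they are concatenated for classification. The shapes, weight matrices, and gating scheme below are illustrative assumptions for exposition, not the paper's actual implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # Scaled dot-product self-attention over token vectors X: (seq_len, d).
    # Each token attends to all tokens, yielding a context-aware representation.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    return softmax(scores, axis=-1) @ V

def reweight_features(feature_groups, gate_logits):
    # Feature reorganization (hypothetical form): a learned softmax gate
    # reassigns a scalar weight to each feature group, then the reweighted
    # groups are concatenated into one vector for the classifier.
    gate = softmax(gate_logits)
    return np.concatenate([g * f for g, f in zip(gate, feature_groups)])
```

With equal gate logits the groups receive equal weight; training would instead learn logits that emphasize the more informative feature categories.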
Keywords:feature reorganization  vector representation  self-attention mechanism  feature weight  deep learning  natural language processing (NLP)