A Knowledge Base Question-Answering Method Based on Multi-Task Learning
Cite this article: JIN Shuchuan, ZHU Yanhui, SHEN Jiarui, MAN Fangteng, ZHANG Zhixuan. A Knowledge Base Question-Answering Method Based on Multi-Task Learning[J]. Journal of Hunan University of Technology, 2024, 38(3): 38-44
Authors: JIN Shuchuan  ZHU Yanhui  SHEN Jiarui  MAN Fangteng  ZHANG Zhixuan
Affiliation: Hunan Province Key Laboratory of Intelligent Information Perception and Processing Technology, College of Computer Science, Hunan University of Technology
Funding: National Natural Science Foundation of China (62106074); Key Projects of the Scientific Research Foundation of the Hunan Provincial Department of Education (22A0408, 21A0350); Hunan Provincial Natural Science Foundation (2022JJ50051)
Abstract: To address the error propagation and the lack of interaction between subtasks in the traditional pipeline approach to knowledge base question answering, a new knowledge base question-answering method is proposed that introduces multi-task learning into the QA system to improve its performance. Multiple subtasks share a single encoder, which drives the model to learn better underlying representations and improves its generalization ability. Experimental results on the CCKS2022-CKBQA task show that the proposed method achieves good performance.
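The shared-encoder idea in the abstract is hard parameter sharing: one encoder (BERT in the paper) feeds several task-specific output heads, so every subtask's training signal updates the same underlying representation. A minimal numpy sketch of this pattern follows; the subtask names, layer sizes, and the tanh encoder are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

# Hard parameter sharing, sketched with toy dimensions: a single shared
# encoder matrix stands in for BERT, and each (hypothetical) subtask gets
# its own small output head on top of the shared representation.

rng = np.random.default_rng(0)

D_IN, D_HID = 8, 16  # toy input and shared-representation sizes
HEADS = {"mention_detection": 3, "relation_ranking": 5}  # hypothetical subtasks

# Shared encoder parameters, used by every subtask.
W_enc = rng.normal(size=(D_IN, D_HID))

# One task-specific output layer per subtask.
W_heads = {name: rng.normal(size=(D_HID, n_out)) for name, n_out in HEADS.items()}

def forward(x: np.ndarray) -> dict[str, np.ndarray]:
    """Encode the input once, then apply every task head to the shared result."""
    h = np.tanh(x @ W_enc)  # shared underlying representation
    return {name: h @ W for name, W in W_heads.items()}

x = rng.normal(size=(4, D_IN))  # batch of 4 toy inputs
outputs = forward(x)
for name, out in outputs.items():
    print(name, out.shape)
```

Because the encoder is computed once and reused, a gradient step on any one head's loss would also adjust `W_enc`, which is the mechanism by which multi-task training shapes the shared representation.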

Keywords: knowledge base question answering; natural language processing; BERT; multi-task learning
Received: 2023-03-18

A Knowledge Base Question-Answer Method Based on Multi-Task Learning
JIN Shuchuan, ZHU Yanhui, SHEN Jiarui, MAN Fangteng, ZHANG Zhixuan. A Knowledge Base Question-Answer Method Based on Multi-Task Learning[J]. Journal of Hunan University of Technology, 2024, 38(3): 38-44
Authors:JIN Shuchuan  ZHU Yanhui  SHEN Jiarui  MAN Fangteng  ZHANG Zhixuan
Affiliation: Key Laboratory of Intelligent Information Perception and Processing Technology of Hunan Province, College of Computer Science, Hunan University of Technology
Abstract: To address such flaws of the traditional pipeline method for knowledge base Q&A as error propagation and the loose connection between different subtasks, a new knowledge base Q&A method is proposed, incorporating multi-task learning into the Q&A system so as to improve its effectiveness. Allowing multiple subtasks to share a single encoder enables the model to acquire a better underlying representation, thus helping to improve its generalization ability. Experimental results on the CCKS2022-CKBQA task verify the better performance of the proposed method.
Keywords:knowledge base Q&A;natural language processing (NLP);BERT;multi-task learning
