Improved DAM Model Based on Multi-head Attention and BiLSTM for Chinese Question Answering
Cite this article: QIN Hanzhong, YU Chongchong, JIANG Weijie, ZHAO Xia. Improved DAM Model Based on Multi-head Attention and BiLSTM for Chinese Question Answering[J]. Journal of Chinese Information Processing, 2021, 35(11): 118-126
Authors: QIN Hanzhong  YU Chongchong  JIANG Weijie  ZHAO Xia
Affiliation: School of Artificial Intelligence, Beijing Technology and Business University, Beijing 100048, China
Funding: Humanities and Social Sciences Research Planning Fund of the Ministry of Education (16YJAZH072); National Social Science Fund of China (14ZDB156)
Abstract: To address the mismatched response details and semantic confusion of the Deep Attention Matching Network (DAM) in retrieval-based multi-turn dialogue, this paper proposes a Chinese question-answer matching method that improves the DAM model with multi-head attention and a Bidirectional Long Short-Term Memory network (BiLSTM). Multi-head attention enables the model to handle longer multi-turn dialogues and better capture the matching relationship between a candidate response and its context. In addition, a BiLSTM is applied in the feature-fusion stage to capture the sequential dependencies across dialogue turns, further improving the accuracy of candidate response selection. Experiments on two open datasets, the Douban Conversation Corpus and the E-commerce Dialogue Corpus, show that the method outperforms the DAM baseline, improving R10@1 by 1.5% with word-embedding enhancement.

Keywords: retrieval-based multi-turn dialogue  DAM  multi-head attention  BiLSTM
Received: 2021-02-21
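The multi-head attention mechanism the abstract builds on can be sketched as follows. This is an illustrative pure-Python sketch of scaled dot-product multi-head attention, not the authors' implementation: real models (including DAM) apply learned per-head projection matrices and operate on batched tensors, both omitted here for clarity.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def attention(query, keys, values):
    # Scaled dot-product attention for a single query vector:
    # weights = softmax(q . k / sqrt(d)); output = weighted sum of values.
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    weights = softmax(scores)
    dim = len(values[0])
    return [sum(w * v[i] for w, v in zip(weights, values)) for i in range(dim)]

def multi_head_attention(query, keys, values, num_heads):
    # Split each vector into num_heads slices, attend within each slice
    # independently, then concatenate the per-head outputs.
    # (Learned per-head projections are omitted in this sketch.)
    d = len(query)
    assert d % num_heads == 0, "dimension must divide evenly across heads"
    step = d // num_heads
    out = []
    for h in range(num_heads):
        s = slice(h * step, (h + 1) * step)
        out.extend(attention(query[s], [k[s] for k in keys],
                             [v[s] for v in values]))
    return out
```

Splitting the representation across heads lets each head weight the dialogue context differently, which is what allows the model to relate a candidate response to several turns of context at once.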
