Intent Determination and Slot Filling in Question Answering
Cite this article: SUN Xin, WANG Houfeng. Intent Determination and Slot Filling in Question Answering [J]. Journal of Chinese Information Processing, 2017, 31(6): 132-139.
Authors: SUN Xin, WANG Houfeng
Affiliation: MOE Key Lab of Computational Linguistics, Peking University, Beijing 100871, China
Funding: National Natural Science Foundation of China (61370117, 61433015)
Abstract: Intent determination (ID) and slot filling (SF) are two important steps in spoken language understanding (SLU). The former is a classification problem that determines the intent of an utterance; the latter can be treated as a sequence labeling problem that assigns specific labels to the key pieces of information. This paper proposes an LSTM-based joint model that combines a CRF with an attention mechanism. For the ID task, the weighted sum of the output-layer vectors of all words is used for classification; for the SF task, transitions between labels are taken into account and the likelihood of a label sequence is computed at the sequence level. Experiments on a Chinese dataset and the English ATIS dataset verify the effectiveness of the proposed method.

Keywords: long short-term memory network  conditional random field  attention mechanism
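As a rough illustration of the intent-determination branch described in the abstract, the sketch below encodes a question with a bidirectional LSTM, computes attention weights over all word outputs, and classifies their weighted sum. It is a minimal PyTorch sketch; the class name, layer sizes, and single-layer attention scorer are illustrative assumptions, not the authors' released implementation.

```python
import torch
import torch.nn as nn

class AttnIntentClassifier(nn.Module):
    """BiLSTM encoder with an attention-weighted sum over all word outputs,
    followed by a linear intent classifier (hypothetical sketch)."""
    def __init__(self, vocab_size, emb_dim, hidden_dim, num_intents):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden_dim, 1)        # one attention score per word
        self.out = nn.Linear(2 * hidden_dim, num_intents)

    def forward(self, token_ids):                        # token_ids: (batch, seq_len)
        h, _ = self.lstm(self.embed(token_ids))          # (batch, seq_len, 2*hidden_dim)
        weights = torch.softmax(self.attn(h), dim=1)     # normalize scores over the sequence
        utterance = (weights * h).sum(dim=1)             # attention-weighted sum of word vectors
        return self.out(utterance)                       # intent logits
```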

Intent Determination and Slot Filling in Question Answering
SUN Xin, WANG Houfeng. Intent Determination and Slot Filling in Question Answering [J]. Journal of Chinese Information Processing, 2017, 31(6): 132-139.
Authors:SUN Xin  WANG Houfeng
Affiliation:MOE Key Lab of Computational Linguistics, Peking University, Beijing 100871, China
Abstract: Intent determination (ID) and slot filling (SF) are two major tasks in spoken language understanding (SLU). The former is a classification problem that judges the intention of an utterance. The latter can be treated as a sequence labeling problem that assigns specific labels to key pieces of information. This paper proposes an LSTM (long short-term memory network) joint model combined with attention and a CRF (conditional random field). For the ID problem, the weighted sum of the output-layer vectors is used as the utterance representation for classification. For the SF problem, the model considers transitions between labels and computes probabilities at the sequence level. The model is verified on both a Chinese corpus and the English ATIS corpus.
Keywords: long short-term memory network  conditional random field  attention mechanism
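For the slot-filling branch, the CRF layer scores an entire label sequence rather than each word independently. The sketch below, with hypothetical variable names, shows how per-word emission scores from the LSTM output layer combine with label-transition scores into a sequence-level score; the normalization over all candidate sequences used in training is noted only as a comment. This is a sketch of the general linear-chain CRF scoring idea, not the paper's exact implementation.

```python
import torch

def sequence_score(emissions, tags, transitions):
    """Sequence-level score of one label sequence in a linear-chain CRF.
    emissions:   (seq_len, num_tags) per-word scores from the LSTM output layer
    tags:        (seq_len,) gold label indices
    transitions: (num_tags, num_tags), transitions[i, j] = score of moving from tag i to tag j
    """
    score = emissions[0, tags[0]]
    for t in range(1, emissions.size(0)):
        score = score + transitions[tags[t - 1], tags[t]] + emissions[t, tags[t]]
    # Training would normalize this score against the log-sum-exp over all
    # possible label sequences (the CRF partition function) to obtain a
    # sequence-level probability.
    return score
```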