Research on a Chinese Semantic Matching Method Based on Siamese Networks
Citation: YU Bi-hui, WANG Jia-cun. Research on a Chinese Semantic Matching Method Based on Siamese Networks [J]. Mini-micro Systems, 2021(2): 231-234.
Authors: YU Bi-hui, WANG Jia-cun
Affiliation: University of Chinese Academy of Sciences; Shenyang Institute of Computing Technology, Chinese Academy of Sciences
Abstract: Semantic matching is a core task in question answering and provides technical support for question-answering systems, information retrieval, and related fields. At present, for this particular classification problem, neural networks mainly use cross-entropy or contrastive loss functions, which constrain the decision boundary only loosely and therefore produce errors near the classification margin. To address this problem, this paper builds on an existing Siamese neural network and introduces the AM-Softmax loss function to improve model accuracy. In addition to the existing word vectors and character vectors used as network input, an attention mechanism is further introduced so that the model captures more textual information. Experimental results show that, compared with previous deep learning models, the performance of the model is further improved.
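The AM-Softmax loss described above tightens the decision boundary by subtracting an additive margin from the true class's cosine similarity before scaling. A minimal single-sample sketch follows; the scale `s` and margin `m` defaults are common illustrative values, not parameters taken from the paper:

```python
import numpy as np

def am_softmax_loss(cos_sims, label, s=30.0, m=0.35):
    """Additive-margin softmax loss for one sample.

    cos_sims: cosine similarities between the input embedding and each
              class weight vector (one per class).
    label:    index of the true class.
    s, m:     scale and additive margin (hyperparameters).
    """
    cos = np.asarray(cos_sims, dtype=float)
    logits = s * cos
    # Penalize the true class by the margin m, forcing its cosine
    # similarity to exceed the others by at least m to reach low loss.
    logits[label] = s * (cos[label] - m)
    logits -= logits.max()  # numerical stability for the exponentials
    return float(np.log(np.exp(logits).sum()) - logits[label])
```

With the margin in place, a sample whose true-class cosine only narrowly beats the others still incurs a noticeably higher loss than under plain softmax, which is what pushes same-meaning sentence pairs tightly together.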

Keywords: semantic matching  Siamese neural network  AM-Softmax  Bi-LSTM

Research on Chinese Semantic Matching in Siamese Network
YU Bi-hui, WANG Jia-cun. Research on Chinese Semantic Matching in Siamese Network [J]. Mini-micro Systems, 2021(2): 231-234.
Authors:YU Bi-hui  WANG Jia-cun
Affiliation: (University of Chinese Academy of Sciences, Beijing 100049, China; Shenyang Institute of Computing Technology, Chinese Academy of Sciences, Shenyang 110168, China)
Abstract: Semantic matching is a core task in the field of question answering, providing technical support for question-answering systems and information retrieval. At present, for this particular classification problem, neural networks mainly use cross-entropy or contrastive loss functions, which constrain the decision boundary only loosely and therefore produce errors at the edges of classification. To solve this problem, this paper builds on an existing Siamese neural network and introduces the AM-Softmax loss function to improve the accuracy of the model. In addition to the existing word vectors and character vectors used as network input, an attention mechanism is further introduced so that the model obtains more textual information. Experimental results show that, compared with previous deep learning models, the performance of the model is further improved.
Keywords: semantic matching  Siamese network  AM-Softmax  Bi-LSTM
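The attention mechanism over the Bi-LSTM outputs mentioned in the abstract can be sketched as a simple attention-weighted pooling: each time step's hidden state is scored against a learned query vector, and the softmax-normalized scores weight the sum. This is a generic sketch under assumed shapes, not the paper's exact architecture; the query vector `query` stands in for a learned parameter:

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(hidden_states, query):
    """Attention-weighted pooling over Bi-LSTM hidden states.

    hidden_states: (T, d) array, one hidden state per time step.
    query:         (d,) learned query vector (assumed parameter).
    Returns a single (d,) sentence representation.
    """
    scores = hidden_states @ query       # (T,) relevance of each time step
    weights = softmax(scores)            # attention distribution over steps
    return weights @ hidden_states       # weighted sum of hidden states
```

In a Siamese setup, both input sentences pass through the same Bi-LSTM and pooling, and the resulting vectors are compared (e.g. by cosine similarity) before the AM-Softmax loss is applied.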
This article has been indexed by VIP and other databases.