Word Representation Based on Word Relations
Citation: JIANG Zhenchao, LI Lishuang, HUANG Degen. Word Representation Based on Word Relations [J]. Journal of Chinese Information Processing, 2017, 31(3): 25-31.
Authors: JIANG Zhenchao  LI Lishuang  HUANG Degen
Affiliation: School of Computer Science and Technology, Dalian University of Technology, Dalian, Liaoning 116024, China
Funding: National Natural Science Foundation of China (61672126, 61173101)
Abstract: Word embeddings represent word meanings as vectors and have recently been incorporated into many natural language processing applications, either as extra features or as direct input, to improve system performance. However, most existing word-embedding training models rely on shallow textual information and do not fully exploit deep dependency relations. A word's meaning is embodied in its relations with other words, and a word relation has three attributes: the related items, the relation type, and the relation direction. This paper therefore proposes a new neural-network-based word-embedding training model with three top layers, each corresponding to one of these attributes, so that word relations are exploited more effectively during training; using large-scale unlabeled text, the model trains word embeddings from both dependency relations and context relations. The resulting embeddings are evaluated on a word-analogy task and a protein-protein interaction extraction task to verify the effectiveness of the relation model. Experiments show that, compared with the skip-gram and CBOW models, the embeddings trained by the relation model express word semantics more accurately.

Keywords: word representation  word embedding  word vectors  neural network  relation model

Word Representation Based on Word Relations
JIANG Zhenchao, LI Lishuang, HUANG Degen. Word Representation Based on Word Relations [J]. Journal of Chinese Information Processing, 2017, 31(3): 25-31.
Authors:JIANG Zhenchao  LI Lishuang  HUANG Degen
Affiliation: School of Computer Science and Technology, Dalian University of Technology, Dalian, Liaoning 116024, China
Abstract: In natural language processing tasks, distributed word representations have succeeded in capturing semantic regularities and have been used as extra features. However, most word representation models are based on shallow context windows, which are not enough to express the meaning of words. The essence of word meaning lies in word relations, which consist of three elements: relation type, relation direction, and related items. In this paper, we leverage a large set of unlabeled texts to make explicit the semantic regularities that emerge in word relations, including dependency relations and context relations, and put forward a novel architecture for computing continuous vector representations. We define three different top layers in the neural network architecture, corresponding to relation type, relation direction, and related words, respectively. Unlike other models, the relation model can use deep syntactic information to train word representations. Evaluated on a word analogy task and a protein-protein interaction extraction task, the relation model performs better overall than the baselines at capturing semantic regularities.
Keywords:word representation  word embedding  word vectors  neural network  relation model  
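The abstract describes a network whose shared word embeddings feed three independent top layers, one per relation attribute (related item, relation type, relation direction). Below is a minimal forward-pass sketch of that idea in numpy; the vocabulary, the relation-type inventory, the layer sizes, and the function names are all illustrative assumptions, since the paper's actual hyperparameters and training procedure are not given here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy inventories -- not from the paper.
VOCAB = ["protein", "binds", "activates", "kinase", "the"]
REL_TYPES = ["nsubj", "dobj", "context"]        # dependency + context relations
DIRECTIONS = ["head->dep", "dep->head"]

V, T, D, dim = len(VOCAB), len(REL_TYPES), len(DIRECTIONS), 8

# Shared word embeddings: the parameters the model actually trains.
E = rng.normal(scale=0.1, size=(V, dim))

# Three separate top layers, one per relation attribute.
W_item = rng.normal(scale=0.1, size=(dim, V))   # predicts the related word
W_type = rng.normal(scale=0.1, size=(dim, T))   # predicts the relation type
W_dir = rng.normal(scale=0.1, size=(dim, D))    # predicts the relation direction

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def forward(word):
    """Project one word's embedding through all three top layers."""
    h = E[VOCAB.index(word)]
    return (softmax(h @ W_item),
            softmax(h @ W_type),
            softmax(h @ W_dir))

p_item, p_type, p_dir = forward("protein")
print(p_item.shape, p_type.shape, p_dir.shape)  # (5,) (3,) (2,)
```

In training, each observed relation (related word, type, direction) would supply one target per head, and the summed cross-entropy losses would be backpropagated into the shared embeddings E, which is how the relational signal shapes the word vectors.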