Dynamic graph representation learning method based on deep neural network and gated recurrent unit
Citation: LI Huibo, ZHAO Yunxiao, BAI Liang. Dynamic graph representation learning method based on deep neural network and gated recurrent unit[J]. Journal of Computer Applications, 2021, 41(12): 3432-3437.
Authors: LI Huibo  ZHAO Yunxiao  BAI Liang
Affiliation: Key Laboratory of Computational Intelligence and Chinese Information Processing of Ministry of Education (Shanxi University), Taiyuan Shanxi 030006, China
School of Computer and Information Technology, Shanxi University, Taiyuan Shanxi 030006, China
Foundation items: National Natural Science Foundation of China (62022052); National Key Research and Development Program of China (2020AAA0106100); Basic Research Program of Shanxi Province (201901D211192)
Abstract: Learning the latent vector representations of nodes in a graph is an important and ubiquitous task that aims to capture various attributes of the nodes. A large body of work has demonstrated that static graph representation learning can capture part of the node information; however, real-world graphs evolve over time. To address the problem that most dynamic network algorithms cannot effectively preserve node neighborhood structure and temporal information, a dynamic network representation learning method based on Deep Neural Network (DNN) and Gated Recurrent Unit (GRU), named DynAEGRU, was proposed. With an Auto-Encoder (AE) as its framework, the neighborhood information was first aggregated by the encoder with a DNN to obtain low-dimensional feature vectors, then the node temporal information was extracted by a GRU network; finally, the adjacency matrix was reconstructed by the decoder and compared with the real graph to construct the loss. Experimental results on three real-world datasets show that DynAEGRU achieves better performance gains than several static and dynamic graph representation learning algorithms.

Keywords: dynamic network representation learning; Deep Neural Network (DNN); Auto-Encoder (AE); Gated Recurrent Unit (GRU); link prediction
Received: 2021-05-12
Revised: 2021-06-28
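To make the encoder-GRU-decoder pipeline described in the abstract concrete, below is a minimal sketch of a DynAEGRU-style model in PyTorch. It is an assumption-based illustration reconstructed from the abstract alone, not the authors' implementation: the class name DynAEGRUSketch, the use of adjacency-matrix rows as node inputs, the layer sizes, the inner-product decoder, and the binary cross-entropy reconstruction loss are all hypothetical choices; only the overall structure (a DNN encoder aggregating neighborhood information into low-dimensional vectors, a GRU over the snapshot sequence, and a decoder reconstructing the adjacency matrix against the real graph) follows the description above.

```python
# Minimal sketch of a DynAEGRU-style model (assumed PyTorch implementation).
# Layer sizes, the inner-product decoder and the BCE loss are illustrative
# assumptions, not the configuration reported in the paper.
import torch
import torch.nn as nn


class DynAEGRUSketch(nn.Module):
    def __init__(self, num_nodes: int, hidden_dim: int = 128, embed_dim: int = 64):
        super().__init__()
        # Encoder: a DNN (MLP) that aggregates each node's neighborhood,
        # represented here by its adjacency-matrix row, into a low-dimensional vector.
        self.encoder = nn.Sequential(
            nn.Linear(num_nodes, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, embed_dim), nn.ReLU(),
        )
        # GRU over the time dimension to capture how each node's embedding evolves.
        self.gru = nn.GRU(input_size=embed_dim, hidden_size=embed_dim, batch_first=True)

    def forward(self, adj_seq: torch.Tensor):
        """adj_seq: (T, N, N) sequence of adjacency matrices (graph snapshots)."""
        z = self.encoder(adj_seq)           # (T, N, embed_dim) per-snapshot embeddings
        z = z.permute(1, 0, 2)              # (N, T, embed_dim): one time sequence per node
        out, _ = self.gru(z)                # temporal information per node
        h = out[:, -1, :]                   # (N, embed_dim) final node representations
        # Decoder: reconstruct the adjacency matrix from node embeddings
        # (an inner-product decoder is assumed here for simplicity).
        recon = torch.sigmoid(h @ h.t())    # (N, N) edge probabilities
        return h, recon


if __name__ == "__main__":
    T, N = 5, 100
    snapshots = (torch.rand(T, N, N) < 0.05).float()   # toy random graph snapshots
    model = DynAEGRUSketch(num_nodes=N)
    emb, recon = model(snapshots)
    # Reconstruction loss against the real (latest) graph, as described in the abstract.
    loss = nn.functional.binary_cross_entropy(recon, snapshots[-1])
    print(emb.shape, recon.shape, loss.item())
```

In this sketch the learned node embeddings `emb` could then be used for downstream tasks such as the link prediction mentioned in the keywords, e.g. by scoring candidate node pairs with the same inner product used by the decoder.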
