Pre-Training on Dynamic Graph Neural Networks
arXiv - CS - Social and Information Networks. Pub Date: 2021-02-24, DOI: arxiv-2102.12380
Jiajun Zhang, Kejia Chen, Yunyun Wang

Pre-training a graph neural network model can learn the general features of large-scale networks, or of networks of the same type, via self-supervised methods, which allows the model to work even when node labels are missing. However, existing pre-training methods do not take network evolution into account. This paper proposes a pre-training method for dynamic graph neural networks (PT-DGNN), which uses a dynamic attributed graph generation task to learn the structure, semantics, and evolution features of the graph simultaneously. The method consists of two steps: 1) dynamic sub-graph sampling, and 2) pre-training with the dynamic attributed graph generation task. Comparative experiments on three real-world dynamic network datasets show that the proposed method achieves the best results on the link prediction fine-tuning task.
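The two steps described in the abstract can be sketched in miniature. The code below is an illustrative assumption, not the paper's actual algorithm: it samples a sub-graph around a seed node with probability biased toward more recent edges (a stand-in for dynamic sub-graph sampling), then hides the newest sampled edges as targets that a generative pre-training task would have to predict. All function names and the timestamp-proportional weighting are hypothetical.

```python
import random
from collections import defaultdict

def sample_dynamic_subgraph(edges, seed_node, budget=4, rng=None):
    """Sample edges around seed_node, preferring recent ones.

    `edges` is a list of (u, v, t) tuples. Candidates are drawn with
    probability proportional to their normalized timestamp (a sketch
    of time-biased sampling; sampling is with replacement).
    """
    rng = rng or random.Random(0)
    adj = defaultdict(list)
    for u, v, t in edges:
        adj[u].append((v, t))
        adj[v].append((u, t))
    frontier = adj[seed_node]
    if not frontier:
        return []
    t_max = max(t for _, t in frontier)
    weights = [t / t_max for _, t in frontier]
    chosen = rng.choices(frontier, weights=weights,
                         k=min(budget, len(frontier)))
    return [(seed_node, v, t) for v, t in chosen]

def split_generation_targets(sub_edges, mask_ratio=0.5):
    """Hide the newest edges as targets for an edge-generation task."""
    ordered = sorted(sub_edges, key=lambda e: e[2])
    cut = max(1, int(len(ordered) * (1 - mask_ratio)))
    return ordered[:cut], ordered[cut:]  # (observed, to_predict)
```

Splitting on timestamp order, rather than at random, is what makes the generation task evolution-aware: the model only sees older edges and must reconstruct newer ones.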

Updated: 2021-02-25