Hybrid Low-Order and Higher-Order Graph Convolutional Networks.
Computational Intelligence and Neuroscience (IF 3.120). Pub Date: 2020-06-23. DOI: 10.1155/2020/3283890
Fangyuan Lei, Xun Liu, Qingyun Dai, Bingo Wing-Kuen Ling, Huimin Zhao, Yan Liu

With the higher-order neighborhood information of a graph network, the accuracy of graph representation learning classification can be significantly improved. However, the current higher-order graph convolutional networks have a large number of parameters and high computational complexity. Therefore, we propose a hybrid lower-order and higher-order graph convolutional network (HLHG) learning model, which uses a weight sharing mechanism to reduce the number of network parameters. To reduce the computational complexity, we propose a novel information fusion pooling layer to combine the high-order and low-order neighborhood matrix information. We theoretically compare the computational complexity and the number of parameters of the proposed model with those of the other state-of-the-art models. Experimentally, we verify the proposed model on large-scale text network datasets using supervised learning and on citation network datasets using semisupervised learning. The experimental results show that the proposed model achieves higher classification accuracy with a small set of trainable weight parameters.
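The abstract does not give the layer equations, but the two key ideas (a shared weight matrix across propagation orders, and a fusion pooling step combining low- and high-order neighborhood information) can be illustrated with a minimal NumPy sketch. The function names, the use of second-order propagation as the "higher order", and element-wise max as the fusion pooling are all assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def normalize_adj(A):
    # Symmetric normalization with self-loops: D^{-1/2} (A + I) D^{-1/2},
    # the standard GCN preprocessing.
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    return A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def hlhg_layer(A_norm, X, W):
    # Low-order (1-hop) and higher-order (here, 2-hop) propagation share
    # the SAME weight matrix W, so the parameter count stays at that of a
    # plain first-order GCN layer (the weight-sharing idea).
    low = A_norm @ X       # first-order neighborhood aggregation
    high = A_norm @ low    # second-order, reusing the first-order product
                           # instead of forming A_norm @ A_norm explicitly
    # Fusion pooling: combine the two orders element-wise (max is one
    # plausible choice) before the single shared linear transform.
    fused = np.maximum(low, high)
    return np.maximum(fused @ W, 0.0)  # ReLU activation

# Tiny 4-node path graph as a usage example.
rng = np.random.default_rng(0)
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = rng.standard_normal((4, 3))   # node features
W = rng.standard_normal((3, 2))   # one shared weight matrix
H = hlhg_layer(normalize_adj(A), X, W)
print(H.shape)  # (4, 2)
```

Computing the second-order term as `A_norm @ (A_norm @ X)` rather than squaring the adjacency matrix keeps the cost at two sparse-dense products per layer, which reflects the complexity-reduction argument made in the abstract.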
