Graph convolutional neural networks with node transition probability-based message passing and DropNode regularization
Expert Systems with Applications (IF 7.5), Pub Date: 2021-02-18, DOI: 10.1016/j.eswa.2021.114711
Tien Huu Do , Duc Minh Nguyen , Giannis Bekoulis , Adrian Munteanu , Nikos Deligiannis

Graph convolutional neural networks (GCNNs) have received much attention recently, owing to their capability in handling graph-structured data. Among the existing GCNNs, many methods can be viewed as instances of a neural message passing motif: features of nodes are passed around their neighbors, aggregated, and transformed to produce better node representations. Nevertheless, these methods seldom use node transition probabilities, a measure that has been found useful in exploring graphs. Furthermore, when the transition probabilities are used, their transition direction is often improperly considered in the feature aggregation step, resulting in an inefficient weighting scheme. In addition, although a great number of GCNN models with increasing levels of complexity have been introduced, GCNNs often suffer from over-fitting when trained on small graphs. Another issue of GCNNs is over-smoothing, which tends to make node representations indistinguishable. This work presents a new method to improve the message passing process based on node transition probabilities by properly considering the transition direction, leading to a better weighting scheme in node feature aggregation compared to the existing counterpart. Moreover, we propose a novel regularization method termed DropNode to address the over-fitting and over-smoothing issues simultaneously. DropNode randomly discards part of a graph, thereby creating multiple deformed versions of the graph and yielding a data-augmentation regularization effect. Additionally, DropNode lessens the connectivity of the graph, mitigating the effect of over-smoothing in deep GCNNs. Extensive experiments on eight benchmark datasets for node and graph classification tasks demonstrate the effectiveness of the proposed methods in comparison with the state of the art.
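To make the two ideas in the abstract concrete, below is a minimal NumPy sketch based only on the description above: a transition matrix obtained by row-normalizing the adjacency matrix, a message passing step that weights neighbor features by transition probabilities (the direction convention here is illustrative, not the paper's exact formulation), and a DropNode step that randomly discards nodes to create a deformed copy of the graph. All function names and parameters are assumptions for illustration.

```python
import numpy as np

def transition_matrix(A):
    # Row-normalize the adjacency matrix so that P[i, j] approximates the
    # probability of a random walk stepping from node i to node j.
    deg = A.sum(axis=1, keepdims=True)
    return A / np.maximum(deg, 1)

def drop_node(A, X, p=0.1, rng=None):
    # Randomly discard a fraction p of nodes (zeroing their features and
    # incident edges), producing one deformed version of the graph.
    # Repeated calls yield different deformations (data augmentation).
    rng = rng or np.random.default_rng()
    keep = rng.random(A.shape[0]) >= p
    A_drop = A * keep[:, None] * keep[None, :]
    X_drop = X * keep[:, None]
    return A_drop, X_drop

def message_pass(A, X, W):
    # One GCNN layer: aggregate neighbor features weighted by transition
    # probabilities toward each node (P transposed, i.e. incoming
    # direction), then apply a linear transform and ReLU.
    P = transition_matrix(A)
    return np.maximum(P.T @ X @ W, 0)
```

In this sketch, applying `drop_node` before `message_pass` during training would both augment the data and reduce graph connectivity, which is how the abstract says DropNode mitigates over-fitting and over-smoothing.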




Updated: 2021-03-04