Stability Properties of Graph Neural Networks
IEEE Transactions on Signal Processing (IF 5.4) Pub Date: 2020-01-01, DOI: 10.1109/tsp.2020.3026980
Fernando Gama , Joan Bruna , Alejandro Ribeiro

Graph neural networks (GNNs) have emerged as a powerful tool for nonlinear processing of graph signals, exhibiting success in recommender systems, power outage prediction, and motion planning, among others. GNNs consist of a cascade of layers, each of which applies a graph convolution followed by a pointwise nonlinearity. In this work, we study the impact that changes in the underlying topology have on the output of the GNN. First, we show that GNNs are permutation equivariant, which implies that they effectively exploit internal symmetries of the underlying topology. Then, we prove that graph convolutions with integral Lipschitz filters, in combination with the frequency mixing effect of the corresponding nonlinearities, yield an architecture that is both stable to small changes in the underlying topology and discriminative of information located at high frequencies. These two properties cannot simultaneously hold when using only linear graph filters, which are either discriminative or stable, thus explaining the superior performance of GNNs.
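The layer structure described in the abstract (a polynomial graph convolution followed by a pointwise nonlinearity) and the permutation-equivariance property can be sketched numerically. This is a minimal illustration, not the paper's implementation: the shift operator `S`, signal `x`, and filter taps `h` below are arbitrary hypothetical values, and ReLU stands in for a generic pointwise nonlinearity.

```python
import numpy as np

def graph_filter(S, x, h):
    """Polynomial graph convolution: y = sum_k h[k] * S^k @ x."""
    y = np.zeros_like(x)
    Sk_x = x.copy()          # holds S^k @ x, starting at k = 0
    for hk in h:
        y = y + hk * Sk_x
        Sk_x = S @ Sk_x
    return y

def gnn_layer(S, x, h):
    """One GNN layer: graph convolution followed by a pointwise ReLU."""
    return np.maximum(graph_filter(S, x, h), 0.0)

rng = np.random.default_rng(0)
n = 5
A = rng.random((n, n))
S = (A + A.T) / 2            # symmetric graph shift operator (e.g. adjacency)
x = rng.standard_normal(n)   # graph signal, one scalar per node
h = [0.5, 0.3, 0.2]          # hypothetical filter taps

# Permutation equivariance: relabeling the nodes (S -> P S P^T, x -> P x)
# permutes the layer output in exactly the same way.
P = np.eye(n)[rng.permutation(n)]
y = gnn_layer(S, x, h)
y_perm = gnn_layer(P @ S @ P.T, P @ x, h)
assert np.allclose(P @ y, y_perm)
```

Because the filter is a polynomial in `S` and the nonlinearity acts entrywise, both operations commute with node permutations, which is why the assertion holds.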

Updated: 2020-01-01