Stochastic Graph Neural Networks
IEEE Transactions on Signal Processing ( IF 4.6 ) Pub Date : 2021-06-28 , DOI: 10.1109/tsp.2021.3092336
Zhan Gao , Elvin Isufi , Alejandro Ribeiro

Graph neural networks (GNNs) model nonlinear representations in graph data, with applications in distributed agent coordination, control, and planning, among others. Current GNN architectures assume ideal scenarios and ignore link fluctuations that occur due to environment, human factors, or external attacks. In these situations, the GNN fails to address its distributed task if the topological randomness is not accounted for. To overcome this issue, we put forth the stochastic graph neural network (SGNN) model: a GNN whose distributed graph convolution module accounts for random network changes. Since stochasticity brings in a new learning paradigm, we conduct a statistical analysis of the SGNN output variance to identify conditions the learned filters should satisfy to achieve robust transference to perturbed scenarios, ultimately revealing the explicit impact of random link losses. We further develop a stochastic gradient descent (SGD) based learning process for the SGNN and derive conditions on the learning rate under which this learning process converges to a stationary point. Numerical results corroborate our theoretical findings and compare the benefits of SGNN robust transference with a conventional GNN that ignores graph perturbations during learning.
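To make the core idea concrete, the following is a minimal sketch of a stochastic graph convolution: each application of the graph shift operator uses a fresh random realization of the graph in which links fail independently. This is an illustrative NumPy toy, not the authors' implementation; the function names, the Bernoulli link-failure model, and the choice of `tanh` nonlinearity are all assumptions made here for exposition.

```python
import numpy as np

rng = np.random.default_rng(0)


def random_realization(S, p, rng):
    """Sample a random shift operator: each undirected link of S
    survives independently with probability p (assumed failure model)."""
    mask = rng.random(S.shape) < p
    mask = np.triu(mask, 1)
    mask = mask | mask.T  # keep link failures symmetric
    return S * mask


def sgnn_layer(x, S, h, p, rng):
    """Stochastic graph convolution followed by a pointwise nonlinearity:
    y = sigma( sum_k h_k S_(k) ... S_(1) x ),
    where each S_(i) is an independent random realization of S."""
    y = h[0] * x
    z = x
    for hk in h[1:]:
        z = random_realization(S, p, rng) @ z  # fresh realization per shift
        y = y + hk * z
    return np.tanh(y)


# Toy example: 5-node cycle graph, 3-tap filter, 90% link reliability.
n = 5
S = np.zeros((n, n))
for i in range(n):
    S[i, (i + 1) % n] = S[(i + 1) % n, i] = 1.0

x = rng.standard_normal(n)
h = np.array([0.5, 0.3, 0.2])
y = sgnn_layer(x, S, h, p=0.9, rng=rng)
```

Because the realizations are resampled at every forward pass, training this layer with SGD exposes the filter taps `h` to the link randomness directly, which is the learning paradigm whose variance and convergence the paper analyzes.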

Updated: 2021-06-28