Graphs, Convolutions, and Neural Networks: From Graph Filters to Graph Neural Networks
IEEE Signal Processing Magazine ( IF 9.4 ) Pub Date : 2020-10-29 , DOI: 10.1109/msp.2020.3016143
Fernando Gama , Elvin Isufi , Geert Leus , Alejandro Ribeiro

Network data can be conveniently modeled as a graph signal, where data values are assigned to nodes of a graph that describes the underlying network topology. Successful learning from network data is built upon methods that effectively exploit this graph structure. In this article, we leverage graph signal processing (GSP) to characterize the representation space of graph neural networks (GNNs). We discuss the role of graph convolutional filters in GNNs and show that any architecture built with such filters has the fundamental properties of permutation equivariance and stability to changes in the topology. These two properties offer insight about the workings of GNNs and help explain their scalability and transferability properties, which, coupled with their local and distributed nature, make GNNs powerful tools for learning in physical networks. We also introduce GNN extensions using edge-varying and autoregressive moving average (ARMA) graph filters and discuss their properties. Finally, we study the use of GNNs in recommender systems and learning decentralized controllers for robot swarms.
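The graph convolutional filters discussed above are polynomials in a graph shift operator. A minimal sketch in NumPy, assuming the standard GSP notation (the names `graph_filter`, `S`, `x`, and `h` are illustrative; `S` is a shift operator such as the adjacency matrix, and `h` is the list of filter taps):

```python
import numpy as np

def graph_filter(S, x, h):
    """Apply the polynomial graph filter y = sum_k h[k] * S^k @ x."""
    y = np.zeros_like(x, dtype=float)
    z = x.astype(float)      # z holds S^k x, starting at k = 0
    for hk in h:
        y += hk * z          # accumulate the k-th tap
        z = S @ z            # shift once: S^k x -> S^{k+1} x
    return y

# Tiny example: a 3-node path graph with a two-tap filter.
S = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
x = np.array([1., 0., 0.])
y = graph_filter(S, x, h=[0.5, 0.25])  # -> [0.5, 0.25, 0.0]
```

Because the filter is built only from powers of `S`, relabeling the nodes with a permutation matrix `P` (so `S -> P S P.T` and `x -> P x`) permutes the output the same way, which is the permutation-equivariance property the article highlights.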

Updated: 2020-10-29