Gated Graph Recurrent Neural Networks
IEEE Transactions on Signal Processing (IF 4.6), Pub Date: 2020-01-01, DOI: 10.1109/tsp.2020.3033962
Luana Ruiz, Fernando Gama, Alejandro Ribeiro

Graph processes exhibit a temporal structure determined by the sequence index and a spatial structure determined by the graph support. To learn from graph processes, an information processing architecture must then be able to exploit both underlying structures. We introduce Graph Recurrent Neural Networks (GRNNs) as a general learning framework that achieves this goal by leveraging the notion of a recurrent hidden state together with graph signal processing (GSP). In the GRNN, the number of learnable parameters is independent of the length of the sequence and of the size of the graph, guaranteeing scalability. We prove that GRNNs are permutation equivariant and that they are stable to perturbations of the underlying graph support. To address the problem of vanishing gradients, we also put forward gated GRNNs with three different gating mechanisms: time, node and edge gates. In numerical experiments involving both synthetic and real datasets, time-gated GRNNs are shown to improve upon GRNNs in problems with long-term dependencies, while node and edge gates help encode long-range dependencies present in the graph. The numerical results also show that GRNNs outperform GNNs and RNNs, highlighting the importance of taking both the temporal and graph structures of a graph process into account.
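The recurrence the abstract describes can be illustrated with a minimal sketch: the hidden state is updated with graph filters (polynomials in a graph shift operator S), so the parameter count depends only on the filter order K and feature dimensions, not on the sequence length or the graph size. All names below (`graph_filter`, `grnn_step`, `gated_grnn_step`) and the exact gating form are illustrative assumptions, not taken from the paper.

```python
# Hedged sketch of a GRNN recurrence with a time-gate variant.
# Assumptions: scalar features per node, tanh nonlinearity, sigmoid gate.
import numpy as np

def graph_filter(S, coeffs, x):
    """Apply a graph filter: sum_k coeffs[k] * S^k @ x."""
    y = np.zeros_like(x)
    Sk_x = x.copy()
    for c in coeffs:
        y += c * Sk_x
        Sk_x = S @ Sk_x          # shift the signal one more hop
    return y

def grnn_step(S, a, b, x_t, h_prev):
    """One GRNN update: h_t = tanh(A(S) x_t + B(S) h_{t-1})."""
    return np.tanh(graph_filter(S, a, x_t) + graph_filter(S, b, h_prev))

def gated_grnn_step(S, a, b, a_g, b_g, x_t, h_prev):
    """Illustrative time-gated variant: a per-node sigmoid gate
    modulates how much of the previous state is carried forward."""
    gate = 1.0 / (1.0 + np.exp(-(graph_filter(S, a_g, x_t)
                                 + graph_filter(S, b_g, h_prev))))
    return np.tanh(graph_filter(S, a, x_t)
                   + gate * graph_filter(S, b, h_prev))

# Toy example: directed cycle graph with N = 5 nodes, filter order K = 3.
N, K = 5, 3
S = np.roll(np.eye(N), 1, axis=1)   # adjacency of a directed cycle
rng = np.random.default_rng(0)
a, b = rng.normal(size=K), rng.normal(size=K)
a_g, b_g = rng.normal(size=K), rng.normal(size=K)

h = np.zeros(N)
for t in range(4):                  # a length-4 sequence of graph signals
    x_t = rng.normal(size=N)
    h = gated_grnn_step(S, a, b, a_g, b_g, x_t, h)
print(h.shape)                      # one hidden value per node
```

Note that `a`, `b`, `a_g`, `b_g` each have K entries regardless of N or the sequence length, which is the scalability property the abstract emphasizes.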

Updated: 2020-01-01