Training Graph Neural Networks with 1000 Layers
arXiv - CS - Social and Information Networks | Pub Date: 2021-06-14 | DOI: arxiv-2106.07476
Guohao Li, Matthias Müller, Bernard Ghanem, Vladlen Koltun

Deep graph neural networks (GNNs) have achieved excellent results on various tasks on increasingly large graph datasets with millions of nodes and edges. However, memory complexity has become a major obstacle when training deep GNNs for practical applications due to the immense number of nodes, edges, and intermediate activations. To improve the scalability of GNNs, prior works propose smart graph sampling or partitioning strategies to train GNNs with a smaller set of nodes or sub-graphs. In this work, we study reversible connections, group convolutions, weight tying, and equilibrium models to advance the memory and parameter efficiency of GNNs. We find that reversible connections in combination with deep network architectures enable the training of overparameterized GNNs that significantly outperform existing methods on multiple datasets. Our models RevGNN-Deep (1001 layers with 80 channels each) and RevGNN-Wide (448 layers with 224 channels each) were both trained on a single commodity GPU and achieve an ROC-AUC of $87.74 \pm 0.13$ and $88.14 \pm 0.15$ on the ogbn-proteins dataset. To the best of our knowledge, RevGNN-Deep is the deepest GNN in the literature by one order of magnitude. Please visit our project website https://www.deepgcns.org/arch/gnn1000 for more information.
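To illustrate the core idea behind the reversible connections the abstract describes, here is a minimal PyTorch sketch (not the authors' released implementation; names such as `SimpleGraphConv` and `RevGNNBlock` are illustrative). It shows a two-group reversible residual block whose inputs can be recovered exactly from its outputs, which is what allows activations to be recomputed during the backward pass rather than stored for every layer. Note that realizing the memory savings in practice additionally requires a custom autograd function that performs this recomputation, as in reversible residual networks; this sketch only demonstrates the exact invertibility.

```python
# Minimal sketch of a two-group reversible GNN block (illustrative, not the
# paper's released code). Forward:  Y1 = X1 + F(X2),  Y2 = X2 + G(Y1).
# Inverse:                          X2 = Y2 - G(Y1),  X1 = Y1 - F(X2).
import torch
import torch.nn as nn


class SimpleGraphConv(nn.Module):
    """Mean-aggregation graph convolution: h_i' = ReLU(W * mean_{j->i} h_j)."""

    def __init__(self, channels: int):
        super().__init__()
        self.lin = nn.Linear(channels, channels)
        self.act = nn.ReLU()

    def forward(self, x: torch.Tensor, edge_index: torch.Tensor) -> torch.Tensor:
        src, dst = edge_index  # directed edges j -> i
        agg = torch.zeros_like(x).index_add_(0, dst, x[src])
        deg = torch.zeros(x.size(0), device=x.device).index_add_(
            0, dst, torch.ones(src.size(0), device=x.device)
        ).clamp(min=1).unsqueeze(-1)
        return self.act(self.lin(agg / deg))


class RevGNNBlock(nn.Module):
    """Reversible residual block over two feature groups of equal width."""

    def __init__(self, channels_per_group: int):
        super().__init__()
        self.f = SimpleGraphConv(channels_per_group)
        self.g = SimpleGraphConv(channels_per_group)

    def forward(self, x1, x2, edge_index):
        y1 = x1 + self.f(x2, edge_index)
        y2 = x2 + self.g(y1, edge_index)
        return y1, y2

    @torch.no_grad()
    def inverse(self, y1, y2, edge_index):
        # Exact reconstruction of the inputs from the outputs: this is what
        # lets a reversible network recompute activations instead of caching them.
        x2 = y2 - self.g(y1, edge_index)
        x1 = y1 - self.f(x2, edge_index)
        return x1, x2


if __name__ == "__main__":
    n, c = 6, 8  # 6 nodes; 16 channels split into two groups of 8
    x = torch.randn(n, 2 * c)
    edge_index = torch.tensor([[0, 1, 2, 3, 4, 5], [1, 2, 3, 4, 5, 0]])  # a ring
    block = RevGNNBlock(c)
    y1, y2 = block(*x.chunk(2, dim=-1), edge_index)
    x1_rec, x2_rec = block.inverse(y1, y2, edge_index)
    print(torch.allclose(torch.cat([x1_rec, x2_rec], -1), x, atol=1e-5))  # True
```

Because each block is invertible, only the final layer's activations need to be kept in memory during training, making the memory cost of activations independent of depth; this is what enables the 1001-layer RevGNN-Deep to train on a single GPU.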

Updated: 2021-06-15