Evolutionary Architecture Search for Graph Neural Networks
arXiv - CS - Neural and Evolutionary Computing. Pub Date: 2020-09-21, DOI: arxiv-2009.10199
Min Shi, David A. Wilson, Xingquan Zhu, Yu Huang, Yuan Zhuang, Jianxun Liu and Yufei Tang

Automated machine learning (AutoML) has seen a resurgence in interest with the boom of deep learning over the past decade. In particular, Neural Architecture Search (NAS) has received significant attention throughout the AutoML research community and has advanced the state of the art in a number of neural models for grid-like data such as texts and images. However, very little work has been done on Graph Neural Networks (GNN) learning on unstructured network data. Given the huge number of choices and combinations of components such as aggregators and activation functions, determining the suitable GNN structure for a specific problem normally necessitates tremendous expert knowledge and laborious trials. In addition, slight variations of hyperparameters such as the learning rate and dropout rate can dramatically hurt the learning capacity of a GNN. In this paper, we propose a novel AutoML framework based on the evolution of individual models in a large GNN architecture space involving both neural structures and learning parameters. Instead of optimizing only the model structures with fixed parameter settings as in existing work, an alternating evolution process is performed between GNN structures and learning parameters to dynamically find the best fit for each other. To the best of our knowledge, this is the first work to introduce and evaluate evolutionary architecture search for GNN models. Experiments and validations demonstrate that evolutionary NAS is capable of matching existing state-of-the-art reinforcement learning approaches for both semi-supervised transductive and inductive node representation learning and classification.
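To make the alternating evolution idea concrete, the following Python sketch shows one possible way such a search could be organized. It is illustrative only and not taken from the paper: the search spaces, the train_and_evaluate placeholder, and the tournament-style evolve routine are assumptions standing in for the actual GNN training and the paper's specific evolutionary operators.

```python
import random

# Illustrative (assumed) search spaces; the paper's exact component and
# hyperparameter choices may differ.
STRUCTURE_SPACE = {
    "aggregator": ["mean", "max", "sum", "lstm"],
    "activation": ["relu", "tanh", "elu"],
    "hidden_dim": [16, 32, 64, 128],
}
PARAM_SPACE = {
    "learning_rate": [1e-3, 5e-3, 1e-2],
    "dropout": [0.0, 0.2, 0.5],
    "weight_decay": [0.0, 1e-5, 5e-4],
}

def random_genes(space):
    # Sample one value for each gene in the given search space.
    return {k: random.choice(v) for k, v in space.items()}

def mutate(genes, space, rate=0.3):
    # Resample each gene independently with a fixed mutation rate.
    return {k: (random.choice(space[k]) if random.random() < rate else v)
            for k, v in genes.items()}

def train_and_evaluate(structure, params):
    # Placeholder fitness: a real implementation would build a GNN with
    # the given structure, train it with the given learning parameters,
    # and return validation accuracy on the target graph dataset.
    return random.random()

def evolve(population, space, fitness, generations=5, pop_size=10):
    # Simple selection + mutation over one half of the genome
    # (structure or learning parameters) while the other half is fixed.
    for _ in range(generations):
        ranked = sorted(population, key=fitness, reverse=True)
        parents = ranked[: max(2, pop_size // 2)]
        population = parents + [mutate(random.choice(parents), space)
                                for _ in range(pop_size - len(parents))]
    return max(population, key=fitness)

# Alternating evolution: optimize structures with parameters fixed,
# then optimize parameters with the best structure fixed, and repeat.
structure = random_genes(STRUCTURE_SPACE)
params = random_genes(PARAM_SPACE)
for _ in range(3):
    structure_pop = [mutate(structure, STRUCTURE_SPACE) for _ in range(10)]
    structure = evolve(structure_pop, STRUCTURE_SPACE,
                       lambda s: train_and_evaluate(s, params))
    param_pop = [mutate(params, PARAM_SPACE) for _ in range(10)]
    params = evolve(param_pop, PARAM_SPACE,
                    lambda p: train_and_evaluate(structure, p))

print("best structure:", structure)
print("best parameters:", params)
```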

Updated: 2020-09-23