Neural Architecture Generator Optimization
arXiv - CS - Neural and Evolutionary Computing. Pub Date: 2020-04-03, DOI: arxiv-2004.01395
Binxin Ru, Pedro Esperanca, Fabio Carlucci

Neural Architecture Search (NAS) was first proposed to achieve state-of-the-art performance through the discovery of new architecture patterns, without human intervention. An over-reliance on expert knowledge in search space design has, however, led to increased performance (local optima) without significant architectural breakthroughs, preventing truly novel solutions from being reached. In this work we 1) are the first to investigate casting NAS as the problem of finding the optimal network generator, and 2) propose a new, hierarchical, graph-based search space capable of representing an extremely large variety of network types while requiring only a few continuous hyper-parameters. This greatly reduces the dimensionality of the problem, enabling the effective use of Bayesian Optimisation as a search strategy. At the same time, we expand the range of valid architectures, motivating a multi-objective learning approach. We demonstrate the effectiveness of this strategy on six benchmark datasets and show that our search space generates extremely lightweight yet highly competitive models.
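To make the generator-centric framing concrete, here is a minimal sketch (not the paper's implementation) assuming a Watts-Strogatz random-graph generator as the network generator, a hypothetical evaluate_architecture stub in place of actual network training, and scikit-optimize's gp_minimize as the Bayesian optimiser. The hyper-parameter ranges and the weighted-sum objective are illustrative assumptions; the point is only that the search space collapses to a handful of generator hyper-parameters.

```python
# Sketch: NAS recast as search over network-generator hyper-parameters
# rather than over individual architectures.
import networkx as nx
from skopt import gp_minimize
from skopt.space import Integer, Real


def sample_wiring(num_nodes, k, p, seed=0):
    """Sample a candidate wiring graph from a Watts-Strogatz generator.

    The outer optimiser searches over the generator's hyper-parameters
    (num_nodes, k, p), not over the sampled graph itself.
    """
    return nx.connected_watts_strogatz_graph(num_nodes, k, p, seed=seed)


def evaluate_architecture(graph):
    """Hypothetical stand-in for training a network wired like `graph`
    and measuring (validation error, parameter count). In practice this
    is the expensive step that Bayesian optimisation spends sparingly."""
    n_edges = graph.number_of_edges()
    val_error = 1.0 / (1.0 + n_edges)  # fake accuracy proxy
    model_size = float(n_edges)        # fake parameter-count proxy
    return val_error, model_size


def objective(params):
    """Scalarised multi-objective: trade accuracy against model size.
    A weighted sum is one simple stand-in for a multi-objective
    treatment of lightweight-but-accurate models."""
    num_nodes, k, p = params
    graph = sample_wiring(int(num_nodes), int(k), float(p))
    val_error, model_size = evaluate_architecture(graph)
    return val_error + 1e-3 * model_size


# Only three continuous/integer hyper-parameters: this is the
# dimensionality reduction that makes Bayesian optimisation practical.
search_space = [
    Integer(8, 32, name="num_nodes"),
    Integer(2, 6, name="k"),   # ring-lattice neighbours per node
    Real(0.0, 1.0, name="p"),  # rewiring probability
]

result = gp_minimize(objective, search_space, n_calls=20, random_state=0)
print("best generator hyper-parameters:", result.x)
```

The paper's actual search space is hierarchical (graphs whose nodes are themselves graphs) and its evaluation trains real networks; the sketch only illustrates how optimising a generator's few hyper-parameters replaces optimising over a combinatorial space of architectures.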

Updated: 2020-10-09