GOLD-NAS: Gradual, One-Level, Differentiable
arXiv - CS - Neural and Evolutionary Computing. Pub Date: 2020-07-07, DOI: arxiv-2007.03331
Kaifeng Bi, Lingxi Xie, Xin Chen, Longhui Wei, Qi Tian

There is a large body of literature on neural architecture search, but most existing work relies on heuristic rules that largely constrain search flexibility. In this paper, we first relax these manually designed constraints and enlarge the search space to contain more than $10^{160}$ candidates. In the new space, most existing differentiable search methods fail dramatically. We then propose a novel algorithm named Gradual One-Level Differentiable Neural Architecture Search (GOLD-NAS), which introduces a variable resource constraint into one-level optimization so that weak operators are gradually pruned out of the super-network. On standard image classification benchmarks, GOLD-NAS finds a series of Pareto-optimal architectures within a single search procedure. Most of the discovered architectures were never studied before, yet they achieve a favorable tradeoff between recognition accuracy and model complexity. We believe the new space and search algorithm can advance research on differentiable NAS.
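
To make the idea concrete, below is a minimal, hypothetical sketch (in PyTorch) of one-level differentiable search with a gradually tightened resource constraint and pruning of weak operators. This is not the authors' implementation: the toy super-network, the sigmoid gating, the gated parameter-count proxy for resource cost, and all hyper-parameters (threshold, schedule) are assumptions made purely for illustration.

```python
# Hypothetical sketch of gradual, one-level, differentiable pruning.
# NOT the official GOLD-NAS code; all names and values are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SuperEdge(nn.Module):
    """One edge of a toy super-network: a sigmoid-gated sum of candidate ops."""
    def __init__(self, channels):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, 3, padding=1),  # candidate op 0: 3x3 conv
            nn.Conv2d(channels, channels, 5, padding=2),  # candidate op 1: 5x5 conv
            nn.Identity(),                                 # candidate op 2: skip connection
        ])
        # One architecture parameter per candidate; sigmoid gating lets each op
        # be kept or dropped independently (no softmax competition between ops).
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))
        self.active = [True] * len(self.ops)

    def forward(self, x):
        gate = torch.sigmoid(self.alpha)
        outs = [gate[i] * op(x) for i, op in enumerate(self.ops) if self.active[i]]
        if not outs:
            return x  # every op on this edge has been pruned; pass input through
        return sum(outs)

    def resource_cost(self):
        # Differentiable complexity proxy: gated parameter count of surviving ops.
        gate = torch.sigmoid(self.alpha)
        costs = torch.tensor(
            [float(sum(p.numel() for p in op.parameters())) for op in self.ops])
        mask = torch.tensor([float(a) for a in self.active])
        return (gate * costs * mask).sum()

    def prune(self, threshold=0.05):
        # Permanently remove operators whose gate has fallen below the threshold.
        gate = torch.sigmoid(self.alpha).detach()
        for i in range(len(self.ops)):
            if self.active[i] and gate[i].item() < threshold:
                self.active[i] = False

# A toy super-network: three stacked edges followed by a linear classifier head.
edges = nn.ModuleList([SuperEdge(8) for _ in range(3)])
head = nn.Linear(8, 10)

def forward(x):
    for edge in edges:
        x = edge(x)
    return head(x.mean(dim=(2, 3)))  # global average pooling, then classify

# One-level optimization: network weights and architecture parameters (alpha)
# are updated together on the same training loss.
optimizer = torch.optim.SGD(
    list(edges.parameters()) + list(head.parameters()), lr=0.01, momentum=0.9)

lam = 1e-7  # coefficient of the resource term, gradually increased
for step in range(200):
    x = torch.randn(16, 8, 16, 16)           # stand-in for a training batch
    y = torch.randint(0, 10, (16,))
    loss = F.cross_entropy(forward(x), y)
    loss = loss + lam * sum(e.resource_cost() for e in edges)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    lam *= 1.02                               # tighten the resource constraint over time
    if step % 20 == 19:
        for e in edges:
            e.prune(threshold=0.05)           # weak operators leave the super-network
```

The sigmoid gating (rather than a softmax over candidates) is meant to reflect the relaxed search space described in the abstract, where no rule forces exactly one operator per edge to survive; snapshots of the super-network taken as the resource coefficient grows would correspond to architectures at different accuracy/complexity tradeoffs.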

Updated: 2020-07-08