An Analysis of Super-Net Heuristics in Weight-Sharing NAS.
IEEE Transactions on Pattern Analysis and Machine Intelligence (IF 20.8). Pub Date: 2022-10-04. DOI: 10.1109/tpami.2021.3108480
Kaicheng Yu, Rene Ranftl, Mathieu Salzmann

Weight sharing promises to make neural architecture search (NAS) tractable even on commodity hardware. Existing methods in this space rely on a diverse set of heuristics to design and train the shared-weight backbone network, a.k.a. the super-net. Since these heuristics vary substantially across methods and have not been carefully studied, it is unclear to what extent they impact super-net training and, hence, the weight-sharing NAS algorithms built on it. In this paper, we disentangle super-net training from the search algorithm, isolate 14 frequently used training heuristics, and evaluate them over three benchmark search spaces. Our analysis uncovers that several commonly used heuristics negatively impact the correlation between super-net and stand-alone performance, whereas simple but often overlooked factors, such as proper hyper-parameter settings, are key to achieving strong performance. Equipped with this knowledge, we show that simple random search is competitive with complex state-of-the-art NAS algorithms when the super-net is properly trained.
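The abstract hinges on two measurable ingredients: the rank correlation between a super-net's proxy scores and stand-alone (trained-from-scratch) accuracy, and plain random search guided by the super-net. The Python sketch below illustrates both under stated assumptions: sample_architecture, supernet_accuracy, and standalone_accuracy are hypothetical stand-ins (not the authors' code or any benchmark's API), while scipy.stats.kendalltau is the standard Kendall tau implementation often used to report such correlations.

import random
from scipy.stats import kendalltau

def sample_architecture(num_edges=6, num_ops=5):
    # Hypothetical encoding: one operation index per edge, loosely in the
    # style of cell-based search spaces such as NAS-Bench-201.
    return tuple(random.randrange(num_ops) for _ in range(num_edges))

def supernet_accuracy(arch):
    # Hypothetical proxy score. In practice this would evaluate `arch` on
    # validation data using the shared super-net weights; stubbed here as a
    # noisy function of the architecture for illustration only.
    return sum(arch) / 30 + random.gauss(0, 0.05)

def standalone_accuracy(arch):
    # Hypothetical ground truth. In practice this would require training
    # `arch` from scratch; stubbed here as a noise-free score.
    return sum(arch) / 30

def rank_correlation(num_archs=50):
    # Kendall tau between super-net proxy scores and stand-alone accuracy:
    # the kind of correlation the paper uses to judge super-net quality.
    archs = [sample_architecture() for _ in range(num_archs)]
    proxy = [supernet_accuracy(a) for a in archs]
    truth = [standalone_accuracy(a) for a in archs]
    tau, _ = kendalltau(proxy, truth)
    return tau

def random_search(num_samples=100):
    # Plain random search: sample architectures and keep the one the
    # super-net ranks highest; no gradient-based or evolutionary search.
    candidates = [sample_architecture() for _ in range(num_samples)]
    return max(candidates, key=supernet_accuracy)

if __name__ == "__main__":
    print(f"Kendall tau (proxy vs. stand-alone): {rank_correlation():.3f}")
    print(f"Best architecture found by random search: {random_search()}")

Under this reading, the paper's claim is that a well-trained super-net yields a high tau, at which point the simple random_search above already finds strong architectures; the heuristics that hurt tau are the ones to avoid.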

Updated: 2021-08-30