Breaking the Curse of Space Explosion: Towards Efficient NAS with Curriculum Search
arXiv - CS - Computer Vision and Pattern Recognition. Pub Date: 2020-07-07. arXiv:2007.07197
Yong Guo, Yaofo Chen, Yin Zheng, Peilin Zhao, Jian Chen, Junzhou Huang, Mingkui Tan

Neural architecture search (NAS) has become an important approach to automatically finding effective architectures. To cover all possible good architectures, we need to search in an extremely large space with billions of candidate architectures. More critically, such a large search space raises the very challenging issue of space explosion: due to limited computational resources, we can only sample a very small proportion of the architectures, which provides insufficient information for training. As a result, existing methods often produce suboptimal architectures. To alleviate this issue, we propose a curriculum search method that starts from a small search space and gradually incorporates the learned knowledge to guide the search in a larger space. With this search strategy, our Curriculum Neural Architecture Search (CNAS) method significantly improves search efficiency and finds better architectures than existing NAS methods. Extensive experiments on CIFAR-10 and ImageNet demonstrate the effectiveness of the proposed method.
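
The curriculum idea described in the abstract (grow the search space stage by stage and carry knowledge from earlier stages into later ones) can be sketched as follows. This is a minimal illustration, not the authors' CNAS implementation: the operation pool, the stage schedule, the `evaluate` proxy, and the use of random sampling in place of CNAS's learned controller are all assumptions made for the sake of a runnable example.

```python
import random

# Hypothetical operation pool; these are common NAS cell operations,
# not taken from the paper.
FULL_OPS = ["identity", "conv_3x3", "conv_5x5", "sep_conv_3x3",
            "sep_conv_5x5", "avg_pool_3x3", "max_pool_3x3", "dil_conv_3x3"]

def evaluate(arch):
    # Placeholder proxy score; a real NAS loop would train the
    # architecture and measure validation accuracy.
    return random.random()

def sample_architecture(ops, num_nodes=4):
    """Sample a cell: each node picks one operation and one earlier
    node (index 0 is the cell input) to connect to."""
    return [(random.choice(ops), random.randrange(i + 1))
            for i in range(num_nodes)]

def curriculum_search(stages=4, samples_per_stage=100):
    """Grow the searchable operation set stage by stage, so early
    stages explore a small space densely and later stages build on
    the best architecture found so far instead of restarting."""
    best, best_score = None, float("-inf")
    for stage in range(1, stages + 1):
        ops = FULL_OPS[: 2 * stage]  # curriculum: a prefix of the full set
        for _ in range(samples_per_stage):
            arch = sample_architecture(ops)
            score = evaluate(arch)
            if score > best_score:
                best, best_score = arch, score
    return best, best_score

if __name__ == "__main__":
    arch, score = curriculum_search()
    print(score, arch)
```

The point of the staging is that a fixed sampling budget covers a small operation set far more densely than the full one, so early stages yield reliable signal; later stages then inherit the incumbent best architecture rather than facing the billion-candidate space cold.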

Updated: 2020-08-06