HardCoRe-NAS: Hard Constrained diffeRentiable Neural Architecture Search
arXiv - CS - Artificial Intelligence. Pub Date: 2021-02-23, arXiv: 2102.11646
Niv Nayman, Yonathan Aflalo, Asaf Noy, Lihi Zelnik-Manor

Realistic use of neural networks often requires adhering to multiple constraints on latency, energy, and memory, among others. A popular approach to finding fitting networks is constrained Neural Architecture Search (NAS); however, previous methods enforce the constraint only softly. Therefore, the resulting networks do not strictly adhere to the resource constraint, and their accuracy is harmed. In this work we resolve this by introducing Hard Constrained diffeRentiable NAS (HardCoRe-NAS), which is based on an accurate formulation of the expected resource requirement and a scalable search method that satisfies the hard constraint throughout the search. Our experiments show that HardCoRe-NAS generates state-of-the-art architectures, surpassing other NAS methods, while strictly satisfying the hard resource constraints without any tuning required.
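To make the "expected resource requirement" concrete, the sketch below illustrates the kind of quantity such a search constrains: the expected latency of a differentiable super-network, modeled as a probability-weighted sum of measured per-operation latencies. This is not the authors' implementation; the latency table, budget, and layer structure are hypothetical.

```python
# Illustrative sketch only (not the HardCoRe-NAS code). In differentiable NAS,
# expected latency is commonly modeled as a sum of per-operation latencies
# weighted by the softmax of the architecture parameters; a hard-constrained
# search keeps this quantity within the budget throughout the search, rather
# than adding a soft latency penalty to the training loss.
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def expected_latency(alphas, latency_table):
    """alphas: one logit vector per layer; latency_table: measured per-op
    latencies (ms) for each layer's candidate operations (hypothetical values)."""
    return sum(softmax(a) @ np.asarray(t) for a, t in zip(alphas, latency_table))

T = 40.0                                        # latency budget in ms (hypothetical)
alphas = [np.zeros(4) for _ in range(3)]        # 3 layers, 4 candidate ops each
latency_table = [[5.0, 10.0, 15.0, 20.0]] * 3   # per-op latencies (hypothetical)

lat = expected_latency(alphas, latency_table)
print(f"expected latency: {lat:.1f} ms, within budget: {lat <= T}")
```

In a soft-constrained method, architectures that exceed the budget are merely penalized; a hard-constrained method, as described in the abstract, keeps the architecture parameters inside the feasible set defined by this budget at every search step.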

Updated: 2021-02-24