One-Shot Neural Architecture Search by Dynamically Pruning Supernet in Hierarchical Order
International Journal of Neural Systems (IF 8), Pub Date: 2021-06-14, DOI: 10.1142/s0129065721500295
Jianwei Zhang, Dong Li, Lituan Wang, Lei Zhang

Neural Architecture Search (NAS), which aims to automatically design neural architectures, has recently drawn growing research interest. Unlike conventional NAS methods, in which a large number of neural architectures must be trained for evaluation, one-shot NAS methods only need to train a single supernet that subsumes all possible candidate architectures. As a result, search efficiency can be significantly improved by sharing the supernet’s weights during the evaluation of candidate architectures. This strategy greatly speeds up the search process but suffers from the challenge that evaluation based on shared weights is not sufficiently predictive. Recently, pruning the supernet during the search has been shown to be an efficient way to alleviate this problem. However, the pruning direction in complex-structured search spaces remains unexplored. In this paper, we revisit the role of the path dropout strategy, which drops neural operations rather than neurons, in supernet training, and identify several interesting characteristics of supernets trained with dropout. Based on these observations, a Hierarchically-Ordered Pruning Neural Architecture Search (HOPNAS) algorithm is proposed, which dynamically prunes the supernet along a proper pruning direction. Experimental results indicate that our method is competitive with state-of-the-art approaches on CIFAR10 and ImageNet.
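To make the weight-sharing and path dropout ideas concrete, below is a minimal PyTorch sketch (not the authors' code) of a single supernet edge that holds several candidate operations and, during supernet training, randomly drops whole operation paths rather than individual neurons. The specific candidate operations, the drop probability, and the rescaling rule are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn

class MixedEdge(nn.Module):
    """One supernet edge that sums the outputs of its surviving candidate ops."""
    def __init__(self, channels: int, drop_prob: float = 0.3):
        super().__init__()
        self.drop_prob = drop_prob
        self.ops = nn.ModuleList([
            nn.Identity(),                                            # skip connection
            nn.Conv2d(channels, channels, 3, padding=1, bias=False),  # 3x3 conv
            nn.Conv2d(channels, channels, 5, padding=2, bias=False),  # 5x5 conv
            nn.AvgPool2d(3, stride=1, padding=1),                     # average pooling
        ])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if self.training:
            # Path dropout: drop entire operation paths, not individual neurons.
            keep = torch.rand(len(self.ops)) > self.drop_prob
            if not keep.any():                      # always keep at least one path
                keep[torch.randint(len(self.ops), (1,))] = True
            outs = [op(x) for op, k in zip(self.ops, keep) if k]
            # Rescale so the expected magnitude matches full-path inference.
            return sum(outs) * (len(self.ops) / keep.sum())
        # At evaluation time all candidate paths contribute (shared weights).
        return sum(op(x) for op in self.ops)

if __name__ == "__main__":
    edge = MixedEdge(channels=16)
    x = torch.randn(2, 16, 32, 32)
    print(edge(x).shape)  # torch.Size([2, 16, 32, 32])
```

In a one-shot setting, many such edges are stacked into the supernet; candidate architectures are then evaluated by keeping only their chosen path on each edge while reusing the shared weights, and pruning amounts to permanently removing weak paths during the search.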

Updated: 2021-06-14