Discretization-aware architecture search
Pattern Recognition (IF 7.5), Pub Date: 2021-07-22, DOI: 10.1016/j.patcog.2021.108186
Yunjie Tian, Chang Liu, Lingxi Xie, Jianbin Jiao, Qixiang Ye

The search cost of neural architecture search (NAS) has been largely reduced by differentiable architecture search and weight-sharing methods. Such methods optimize a super-network containing all candidate edges and operations, and then determine the optimal sub-network by discretization, i.e., by pruning operations and edges with small weights. However, discretization performed on either operations or edges incurs significant inaccuracy, so the quality of the resulting architecture is not guaranteed. In this paper, we propose discretization-aware architecture search (DA²S), which pushes the super-network towards the configuration of the desired topology. DA²S is implemented as an entropy-based loss term that can be added to differentiable architecture search as a plug-and-play regularizer. The regularization is controlled by carefully designed continuation functions, so that discretization adapts to the dynamic change of edges and operations. Experiments on standard image classification benchmarks demonstrate the effectiveness of our approach, in particular under imbalanced network configurations that had not been studied before. Code is available at github.com/sunsmarterjie/DAAS.
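The core idea of an entropy-based discretization loss can be sketched as follows. In differentiable NAS, each edge holds a vector of architecture parameters whose softmax gives a distribution over candidate operations; penalizing the entropy of that distribution pushes it towards a one-hot choice, so the later pruning step discards less probability mass. The function names below and the linear continuation schedule are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def softmax(alpha):
    """Softmax over architecture parameters of one edge."""
    e = np.exp(alpha - alpha.max())
    return e / e.sum()

def entropy_loss(alpha):
    """Entropy of the operation distribution on one edge.
    High for near-uniform weights, low for near-one-hot weights."""
    p = softmax(alpha)
    return -np.sum(p * np.log(p + 1e-12))

def continuation_coeff(step, total_steps, lam_max=1.0):
    """Illustrative continuation function: the regularization
    weight ramps up linearly as the search progresses, so early
    exploration is not suppressed."""
    return lam_max * step / total_steps

# A uniform distribution over 8 candidate operations has maximal
# entropy; a peaked one (close to a one-hot choice) has low entropy.
alpha_uniform = np.zeros(8)
alpha_peaked = np.array([5.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0])

total_loss_term = continuation_coeff(step=50, total_steps=100) \
    * entropy_loss(alpha_peaked)
```

In a full search loop this term would be added to the validation loss used to update the architecture parameters, so that by the end of the search each edge's distribution is nearly discrete and pruning changes the network little.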




Updated: 2021-07-28