Efficient network architecture search via multiobjective particle swarm optimization based on decomposition.
Neural Networks (IF 7.8), Pub Date: 2019-12-16, DOI: 10.1016/j.neunet.2019.12.005
Jing Jiang, Fei Han, Qinghua Ling, Jie Wang, Tiange Li, Henry Han

Manually increasing the width and depth of a convolutional neural network (CNN) usually requires a large amount of time and expertise, which has stimulated a rising demand for neural architecture search (NAS) in recent years. However, most popular NAS approaches optimize solely for low prediction error without penalizing high structural complexity. To this end, this paper proposes MOPSO/D-Net, a CNN architecture search method using multiobjective particle swarm optimization based on decomposition (MOPSO/D). The main goal is to reformulate NAS as a multiobjective evolutionary optimization problem, in which the optimal architecture is learned by minimizing two conflicting objectives, namely the classification error rate and the number of network parameters. Together with a hybrid binary encoding and an adaptive penalty-based boundary intersection, an improved MOPSO/D is further proposed to solve the formulated multiobjective NAS problem and provide diverse tradeoff solutions. Experimental studies verify the effectiveness of MOPSO/D-Net compared with current manual and automated CNN generation methods. The proposed algorithm achieves impressive classification performance with a small number of parameters on two benchmark datasets, specifically a 0.4% error rate with 0.16M parameters on MNIST and a 5.88% error rate with 8.1M parameters on CIFAR-10.
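
The abstract only outlines the search procedure, so below is a minimal, self-contained sketch of the general idea it describes: a decomposition-based multiobjective PSO whose subproblems are scalarized with penalty-based boundary intersection (PBI). This is not the authors' implementation. The two objectives are cheap toy surrogates for classification error and parameter count (no CNN is trained), a plain bit-threshold encoding stands in for the paper's hybrid binary encoding, the PBI penalty THETA is fixed rather than adaptive, and all names and hyperparameters (evaluate, pbi, DIM, N_SUB, ITERS) are illustrative assumptions.

```python
import numpy as np

DIM = 20      # bits in each candidate "architecture" encoding (toy size)
N_SUB = 30    # number of subproblems, one particle per subproblem
THETA = 5.0   # PBI penalty factor (the paper adapts this; fixed here)
ITERS = 100

rng = np.random.default_rng(0)

def evaluate(bits):
    """Toy surrogate objectives standing in for (error rate, #parameters):
    f1 falls as more units are switched on, f2 grows with them."""
    active = bits.sum() / len(bits)
    return np.array([(1.0 - active) ** 2, active])

def pbi(f, weight, z_star, theta=THETA):
    """Penalty-based boundary intersection scalarization of objective vector f."""
    w = weight / np.linalg.norm(weight)
    diff = f - z_star
    d1 = abs(diff @ w)                       # distance along the weight direction
    d2 = np.linalg.norm(diff - d1 * w)       # distance away from the weight direction
    return d1 + theta * d2

# Uniformly spread weight vectors decompose the bi-objective problem.
weights = np.stack([np.linspace(0.01, 0.99, N_SUB),
                    1.0 - np.linspace(0.01, 0.99, N_SUB)], axis=1)

# Particle state: continuous positions/velocities, thresholded into bits for evaluation.
pos = rng.random((N_SUB, DIM))
vel = np.zeros((N_SUB, DIM))
bits = (pos > 0.5).astype(int)
objs = np.array([evaluate(b) for b in bits])
pbest_bits, pbest_objs = bits.copy(), objs.copy()
z_star = objs.min(axis=0)                    # running estimate of the ideal point

for _ in range(ITERS):
    # Global guide per subproblem: the personal best with the lowest PBI value.
    scores = np.array([[pbi(f, w, z_star) for f in pbest_objs] for w in weights])
    gbest_bits = pbest_bits[scores.argmin(axis=1)]

    r1, r2 = rng.random((N_SUB, DIM)), rng.random((N_SUB, DIM))
    vel = 0.7 * vel + 1.5 * r1 * (pbest_bits - bits) + 1.5 * r2 * (gbest_bits - bits)
    pos = np.clip(pos + vel, 0.0, 1.0)
    bits = (pos > 0.5).astype(int)

    objs = np.array([evaluate(b) for b in bits])
    z_star = np.minimum(z_star, objs.min(axis=0))

    # Keep a new position as personal best only if it improves its own subproblem.
    for i in range(N_SUB):
        if pbi(objs[i], weights[i], z_star) < pbi(pbest_objs[i], weights[i], z_star):
            pbest_bits[i], pbest_objs[i] = bits[i], objs[i]

# Each subproblem yields one tradeoff solution along the (error, size) front.
print(np.unique(np.round(pbest_objs, 3), axis=0))
```

In this sketch each weight vector pins one particle to a different region of the error/size tradeoff, which is how a single decomposition-based run can return the diverse tradeoff solutions the abstract refers to.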

Updated: 2019-12-17