Evolutionary Neural Architecture Search for High-Dimensional Skip-Connection Structures on DenseNet Style Networks
IEEE Transactions on Evolutionary Computation (IF 11.7), Pub Date: 2021-05-24, DOI: 10.1109/tevc.2021.3083315
Damien O'Neill, Bing Xue, Mengjie Zhang

Convolutional neural networks hold state-of-the-art results for image classification, and many neural architecture search algorithms have been proposed to discover high-performance convolutional neural networks. However, the use of neural architecture search to discover skip-connection structures, an important element of modern convolutional neural networks, remains limited in the literature. Furthermore, while many neural architecture search algorithms rely on performance estimation techniques to reduce computation time, empirical evaluations of these techniques remain limited. This work uses evolutionary neural architecture search to examine the search space of networks that follow a fundamental DenseNet structure but have no fixed skip connections. In particular, a genetic algorithm is designed that searches the space of all networks between a standard feedforward network and the corresponding DenseNet. To design the algorithm, lower-fidelity performance estimation for this class of networks is examined and presented. The final algorithm finds networks that are more accurate than DenseNets on CIFAR10 and CIFAR100 while having fewer trainable parameters. The structures found by the algorithm are examined to shed light on the importance of different types of skip-connection structures in convolutional neural networks, including the discovery of a simple skip-connection removal that improves DenseNet performance on CIFAR10.
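The abstract describes a genetic algorithm that searches the space of all networks lying between a plain feedforward stack and the corresponding DenseNet, i.e. every possible subset of the skip connections in a dense block. The sketch below is a minimal, hypothetical illustration of that idea, not the authors' implementation: it assumes a binary-mask encoding over candidate skip connections and uses a placeholder fitness function where the paper would use lower-fidelity performance estimation (e.g. short training runs on CIFAR10).

```python
import random

# Hypothetical encoding: in a block with num_layers layers, layer j may receive
# a skip connection from any earlier layer i with i < j - 1 (the direct
# i -> i+1 connections are always present). A genome is a binary mask over
# these candidate pairs:
#   all zeros -> plain feedforward network
#   all ones  -> the corresponding DenseNet block
def candidate_pairs(num_layers):
    return [(i, j) for j in range(num_layers) for i in range(j - 1)]

def random_genome(pairs):
    return [random.randint(0, 1) for _ in pairs]

def mutate(genome, rate=0.05):
    # Independent bit-flip mutation.
    return [bit ^ 1 if random.random() < rate else bit for bit in genome]

def crossover(a, b):
    # Uniform crossover: each gene is taken from either parent with equal probability.
    return [ai if random.random() < 0.5 else bi for ai, bi in zip(a, b)]

def fitness(genome):
    # Placeholder only: the paper scores architectures with lower-fidelity
    # performance estimation (e.g. short training runs). Here we simply reward
    # sparser connection patterns so the sketch runs end to end.
    return 1.0 - sum(genome) / max(len(genome), 1)

def evolve(num_layers=12, pop_size=20, generations=10, tournament=3):
    pairs = candidate_pairs(num_layers)
    population = [random_genome(pairs) for _ in range(pop_size)]
    for _ in range(generations):
        scored = [(fitness(g), g) for g in population]
        def select():
            # Tournament selection over (fitness, genome) pairs.
            return max(random.sample(scored, tournament))[1]
        population = [mutate(crossover(select(), select())) for _ in range(pop_size)]
    best = max(population, key=fitness)
    return best, pairs

if __name__ == "__main__":
    best, pairs = evolve()
    kept = [p for p, bit in zip(pairs, best) if bit]
    print(f"kept {len(kept)} of {len(pairs)} candidate skip connections")
```

With this encoding, the all-zeros genome decodes to the standard feedforward network and the all-ones genome to the full DenseNet block, so the genetic algorithm explores everything in between, matching the search space described in the abstract.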

Updated: 2021-05-24