Fast and Unsupervised Neural Architecture Evolution for Visual Representation Learning
IEEE Computational Intelligence Magazine (IF 9) · Pub Date: 2021-07-21 · DOI: 10.1109/mci.2021.3084394
Song Xue, Hanlin Chen, Chunyu Xie, Baochang Zhang, Xuan Gong, David Doermann

Unsupervised visual representation learning is one of the hottest topics in computer vision, yet its performance still lags behind that of the best supervised learning methods. At the same time, neural architecture search (NAS) has produced state-of-the-art results on various visual tasks. Exploring NAS as a way to improve unsupervised representation learning is a natural idea, yet it remains largely unexplored. In this paper, we propose a Fast and Unsupervised Neural Architecture Evolution (FaUNAE) method that evolves an existing architecture, either manually constructed or the result of NAS on a small dataset, into a new architecture that can operate on a larger dataset. This partial optimization exploits prior knowledge to reduce search cost and improve search efficiency. The evolution is self-supervised, with a contrastive loss serving as the evaluation metric in a student-teacher framework. By eliminating the inferior or least promising operations, the evolutionary process is greatly accelerated. Experimental results show that we achieve state-of-the-art performance on downstream applications such as object recognition, object detection, and instance segmentation.
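The abstract describes a loop in which candidate architectures are scored by a contrastive loss in a student-teacher setup, and the least promising operations are eliminated from the search space to speed up evolution. Below is a minimal, hypothetical Python/PyTorch sketch of that idea. It is not the authors' FaUNAE implementation: the toy search space (OPS), the helper names (build_net, contrastive_fitness), the random toy data, and the one-shot elimination schedule are all illustrative assumptions.

```python
# Hypothetical sketch of contrastive-fitness architecture evolution.
# Not the authors' FaUNAE code; names and schedule are assumptions.
import copy
import random
import torch
import torch.nn as nn
import torch.nn.functional as F

# Toy search space: each layer picks one operation by name.
OPS = {
    "conv3": lambda c: nn.Conv2d(c, c, 3, padding=1),
    "conv5": lambda c: nn.Conv2d(c, c, 5, padding=2),
    "skip":  lambda c: nn.Identity(),
}
N_LAYERS, CHANNELS = 3, 8

def build_net(genotype):
    """Instantiate a tiny student network from a list of op names."""
    layers = [nn.Conv2d(3, CHANNELS, 3, padding=1)]
    for op in genotype:
        layers += [OPS[op](CHANNELS), nn.ReLU()]
    layers += [nn.AdaptiveAvgPool2d(1), nn.Flatten()]
    return nn.Sequential(*layers)

def contrastive_fitness(genotype, views1, views2, tau=0.2):
    """Score an architecture by InfoNCE loss between a student and a frozen
    teacher copy (lower loss = better representation = higher fitness).
    In the paper's setting the student would inherit shared supernet
    weights; here weights are random for brevity."""
    student = build_net(genotype)
    teacher = copy.deepcopy(student)  # stand-in for a momentum teacher
    with torch.no_grad():
        q = F.normalize(student(views1), dim=1)
        k = F.normalize(teacher(views2), dim=1)
        logits = q @ k.t() / tau           # pairwise similarity matrix
        labels = torch.arange(q.size(0))   # positives on the diagonal
        loss = F.cross_entropy(logits, labels)
    return -loss.item()

torch.manual_seed(0)
random.seed(0)
views1, views2 = torch.randn(16, 3, 32, 32), torch.randn(16, 3, 32, 32)
population = [[random.choice(list(OPS)) for _ in range(N_LAYERS)]
              for _ in range(8)]
op_scores = {op: [] for op in OPS}
for generation in range(5):
    scored = [(contrastive_fitness(g, views1, views2), g) for g in population]
    scored.sort(key=lambda t: t[0], reverse=True)
    for fit, g in scored:
        for op in g:
            op_scores[op].append(fit)
    # FaUNAE-style acceleration: drop the least promising op once
    # enough evaluation evidence has accumulated (one-shot here).
    if generation == 2 and len(OPS) > 1:
        worst = min(OPS, key=lambda o: sum(op_scores[o]) / len(op_scores[o]))
        del OPS[worst]
        print(f"eliminated op: {worst}")
    # Keep the top half (remapping any eliminated ops), refill by mutation.
    survivors = [[op if op in OPS else random.choice(list(OPS)) for op in g]
                 for _, g in scored[: len(scored) // 2]]
    children = [list(g) for g in survivors]
    for child in children:
        child[random.randrange(N_LAYERS)] = random.choice(list(OPS))
    population = survivors + children
print("best genotype:", scored[0][1])
```

The frozen deepcopy teacher is only a placeholder for the momentum-updated teacher of the student-teacher framework, and the fixed elimination step stands in for whatever pruning criterion the paper actually uses; both choices keep the sketch short and self-contained.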

Updated: 2021-09-12