MetAdapt: Meta-learned task-adaptive architecture for few-shot classification
Pattern Recognition Letters ( IF 3.9 ) Pub Date : 2021-06-12 , DOI: 10.1016/j.patrec.2021.05.010
Sivan Doveh , Eli Schwartz , Chao Xue , Rogerio Feris , Alex Bronstein , Raja Giryes , Leonid Karlinsky

Recently, great progress has been made in Few-Shot Learning (FSL). While many different methods have been proposed, one of the key factors behind higher FSL performance is surprisingly simple: the backbone network architecture used to embed the images of the few-shot tasks. While early work on FSL resorted to small architectures with just a few convolutional layers, recent work shows that large architectures pre-trained on the training portion of FSL datasets produce strong features that transfer more easily to novel few-shot tasks, yielding significant gains for the methods that use them. Despite these observations, little to no work has been done on finding the right backbone for FSL. In this paper we propose MetAdapt, which not only meta-searches for an optimized FSL architecture using Network Architecture Search (NAS), but also yields a model that can adaptively 're-wire' itself, predicting a better architecture for each given novel few-shot task. Using the proposed approach we observe strong results on two popular few-shot benchmarks: miniImageNet and FC100.
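The abstract does not spell out the mechanics, but the general idea it describes (a continuous, NAS-style mixture of candidate operations whose architecture weights are shifted per task from the few-shot support set) can be sketched as a toy example. Everything below is an assumption for illustration only: the candidate operations, the linear `adapter`, and the mean-pooled task embedding are hypothetical stand-ins, not the paper's actual components.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over architecture weights
    e = np.exp(x - x.max())
    return e / e.sum()

# Toy candidate operations for one searchable connection
# (stand-ins for real conv blocks; illustrative only).
CANDIDATE_OPS = [
    lambda x: x,                   # identity / skip connection
    lambda x: np.maximum(x, 0.0),  # ReLU-like nonlinearity
    lambda x: 0.5 * x,             # scaled linear op
]

def mixed_op(x, alpha):
    """DARTS-style continuous relaxation: softmax-weighted sum of ops."""
    w = softmax(alpha)
    return sum(wi * op(x) for wi, op in zip(w, CANDIDATE_OPS))

def task_adapted_alpha(alpha, support_set, adapter_w):
    """Shift shared (meta-learned) architecture weights using a
    prediction from the task's support set — a crude per-task
    're-wiring' via a hypothetical linear adapter."""
    task_embedding = support_set.mean(axis=0)  # mean-pooled task summary
    return alpha + adapter_w @ task_embedding

# A toy 5-shot task with 4-dimensional features.
rng = np.random.default_rng(0)
support = rng.normal(size=(5, 4))
alpha = np.zeros(len(CANDIDATE_OPS))                       # shared weights
adapter = rng.normal(scale=0.1, size=(len(CANDIDATE_OPS), 4))

alpha_task = task_adapted_alpha(alpha, support, adapter)   # task-specific
out = mixed_op(rng.normal(size=4), alpha_task)             # adapted forward
```

In a real search the shared `alpha` would be meta-learned across training tasks and the adapter trained to predict useful per-task shifts; here both are random to keep the sketch self-contained.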




Updated: 2021-07-02