Revisiting Unsupervised Meta-Learning via the Characteristics of Few-Shot Tasks
IEEE Transactions on Pattern Analysis and Machine Intelligence (IF 20.8). Pub Date: 2022-06-01. DOI: 10.1109/tpami.2022.3179368
Han-Jia Ye, Lu Han, De-Chuan Zhan
Meta-learning has become a practical approach to few-shot image classification, where "a strategy to learn a classifier" is meta-learned on labeled base classes and can be applied to tasks with novel classes. We remove the requirement of base-class labels and learn generalizable embeddings via Unsupervised Meta-Learning (UML). Specifically, episodes of tasks are constructed with data augmentations from unlabeled base classes during meta-training, and embedding-based classifiers are applied to novel tasks with labeled few-shot examples during meta-test. We observe that two elements play important roles in UML: the way tasks are sampled and the way similarities between instances are measured. We thus obtain a strong baseline with two simple modifications: a sufficient sampling strategy that efficiently constructs multiple tasks per episode, and a semi-normalized similarity. We then exploit the characteristics of tasks from two directions to obtain further improvements. First, synthesized confusing instances are incorporated to help extract more discriminative embeddings. Second, we use an additional task-specific embedding transformation as an auxiliary component during meta-training to promote the generalization ability of the pre-adapted embeddings. Experiments on few-shot learning benchmarks verify that our approaches outperform previous UML methods and achieve comparable or even better performance than their supervised variants.
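The two baseline ingredients above can be sketched in code. In this minimal sketch, each sampled unlabeled instance acts as its own pseudo-class, with augmented views forming the support and query sets; the "semi-normalized" similarity is illustrated here as normalizing only one side of the inner product. The function names, the noise-style augmentation, and the choice of which side to normalize are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def semi_normalized_similarity(query, support):
    """Inner product where only the support side is L2-normalized.

    This one-sided normalization is an assumed reading of the paper's
    'semi-normalized similarity'; the exact definition may differ.
    """
    support_norm = support / np.linalg.norm(support, axis=-1, keepdims=True)
    return query @ support_norm.T  # shape: (n_query, n_way)

def make_uml_episode(pool, augment, n_way=5, n_shot=1, n_query=1, rng=None):
    """Build one UML episode from an unlabeled pool.

    Each of the n_way sampled instances is treated as its own pseudo-class;
    independent augmentations of it form its support and query examples.
    """
    rng = np.random.default_rng(rng)
    idx = rng.choice(len(pool), size=n_way, replace=False)
    support, query, labels = [], [], []
    for cls, i in enumerate(idx):
        for _ in range(n_shot):
            support.append(augment(pool[i]))
        for _ in range(n_query):
            query.append(augment(pool[i]))
            labels.append(cls)
    return np.stack(support), np.stack(query), np.array(labels)
```

In practice the augmentations would be image transforms (crops, color jitter) and the embeddings would come from a meta-trained encoder; a query is then classified by the pseudo-class of its most similar support embedding.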

Updated: 2024-08-28