Unsupervised Few-Shot Feature Learning via Self-Supervised Training
Frontiers in Computational Neuroscience (IF 3.2). Pub Date: 2020-10-14. DOI: 10.3389/fncom.2020.00083
Zilong Ji, Xiaolong Zou, Tiejun Huang, Si Wu

Learning from limited exemplars (few-shot learning) is a fundamental, unsolved problem that has been laboriously explored in the machine learning community. However, current few-shot learners are mostly supervised and rely heavily on a large number of labeled examples. Unsupervised learning is a more natural procedure for cognitive mammals and has produced promising results in many machine learning tasks. In this paper, we propose an unsupervised feature learning method for few-shot learning. The proposed model consists of two alternating processes: progressive clustering and episodic training. The former generates pseudo-labeled training examples for constructing episodic tasks; the latter trains the few-shot learner on the generated episodic tasks, which further optimizes the feature representations of the data. The two processes facilitate each other and eventually produce a high-quality few-shot learner. In our experiments, our model achieves good generalization performance on a variety of downstream few-shot learning tasks on Omniglot and MiniImageNet. We also construct a new few-shot person re-identification dataset, FS-Market1501, to demonstrate the feasibility of our model in a real-world application.
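The alternation described above can be sketched in a minimal, hypothetical form: (1) cluster unlabeled feature vectors to obtain pseudo-labels (the "progressive clustering" step), then (2) sample N-way K-shot episodes from the pseudo-labeled pool (the "episodic training" step). The feature-extractor update itself is omitted; all function names, the farthest-point initialization, and the toy data below are illustrative assumptions, not details from the paper.

```python
import numpy as np

def farthest_point_init(feats, k):
    """Deterministic, spread-out initial centers for k-means (an assumption,
    used here so the toy clustering is reproducible)."""
    centers = [feats[0]]
    for _ in range(k - 1):
        d = np.min(np.linalg.norm(feats[:, None] - np.array(centers)[None],
                                  axis=-1), axis=1)
        centers.append(feats[d.argmax()])
    return np.array(centers)

def kmeans_pseudo_labels(feats, k, iters=20):
    """Assign each feature vector a pseudo-label via plain k-means."""
    centers = farthest_point_init(feats, k).copy()
    labels = np.zeros(len(feats), dtype=int)
    for _ in range(iters):
        dists = np.linalg.norm(feats[:, None, :] - centers[None, :, :], axis=-1)
        labels = dists.argmin(axis=1)
        for c in range(k):
            members = feats[labels == c]
            if len(members):
                centers[c] = members.mean(axis=0)
    return labels

def sample_episode(feats, labels, n_way, k_shot, q_query, seed=0):
    """Build one N-way K-shot episode (support + query) from pseudo-labels."""
    rng = np.random.default_rng(seed)
    classes = rng.choice(np.unique(labels), size=n_way, replace=False)
    support, query = [], []
    for c in classes:
        idx = rng.permutation(np.where(labels == c)[0])
        support.append(feats[idx[:k_shot]])
        query.append(feats[idx[k_shot:k_shot + q_query]])
    return np.stack(support), np.stack(query)  # (N, K, D), (N, Q, D)

# Toy run: three well-separated Gaussian blobs in a 2-D feature space.
rng = np.random.default_rng(1)
feats = np.concatenate([rng.normal(loc=m, scale=0.1, size=(30, 2))
                        for m in ([0, 0], [5, 5], [10, 0])])
labels = kmeans_pseudo_labels(feats, k=3)
support, query = sample_episode(feats, labels, n_way=2, k_shot=5, q_query=5)
print(support.shape, query.shape)  # (2, 5, 2) (2, 5, 2)
```

In the full method these two steps would alternate: the episodic loss updates the feature extractor, the improved features yield better clusters, and so on until the pseudo-labels stabilize.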
