Deep online classification using pseudo-generative models
Computer Vision and Image Understanding (IF 4.3), Pub Date: 2020-08-08, DOI: 10.1016/j.cviu.2020.103048
Andrey Besedin, Pierre Blanchart, Michel Crucianu, Marin Ferecatu

In this work we propose a new deep learning based approach for online classification on streams of high-dimensional data. While requiring very little storage of historical data, our approach alleviates catastrophic forgetting in the continual learning scenario without assuming stationarity of the data in the stream. To make up for the absence of historical data, we propose a new generative autoencoder endowed with an auxiliary loss function that ensures fast, task-sensitive convergence. To evaluate our approach we perform experiments on two well-known image datasets, MNIST and LSUN, in a continuous streaming mode. We extend the experiments to a large multi-class synthetic dataset that allows us to assess the performance of our method in more challenging settings with up to 1000 distinct classes. Our approach performs classification on dynamic data streams with an accuracy close to that obtained in the offline setup, where all the data are available for the full duration of training. In addition, we demonstrate the ability of our method to adapt to unseen data classes and to new instances of already known data categories, while avoiding catastrophic forgetting of previously acquired knowledge.
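The abstract does not disclose the authors' architecture or their auxiliary loss, but the core idea (generative replay: rehearse on pseudo-samples decoded from a generative autoencoder instead of stored historical data) can be sketched in a toy form. In the hypothetical sketch below, single linear maps stand in for deep encoder/decoder networks, and a softmax cross-entropy head on the latent code stands in for the task-sensitive auxiliary loss; all names and hyperparameters are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

class PseudoGenerativeAE:
    """Toy linear autoencoder with an auxiliary classification head.

    Illustrative only: linear maps stand in for the deep networks of the
    paper, and softmax cross-entropy stands in for its auxiliary loss.
    """

    def __init__(self, d_in, d_lat, n_cls, lr=0.05):
        self.d_lat = d_lat
        self.We = rng.normal(0.0, 0.1, (d_lat, d_in))   # encoder
        self.Wd = rng.normal(0.0, 0.1, (d_in, d_lat))   # decoder ("generator")
        self.Wc = rng.normal(0.0, 0.1, (n_cls, d_lat))  # auxiliary classifier
        self.lr = lr

    def encode(self, x):
        return self.We @ x

    def decode(self, z):
        return self.Wd @ z

    def predict(self, x):
        return int(np.argmax(self.Wc @ self.encode(x)))

    def step(self, x, y, alpha=0.5):
        """One SGD step on reconstruction loss + alpha * auxiliary CE loss."""
        z = self.encode(x)
        err = self.decode(z) - x                 # reconstruction residual
        logits = self.Wc @ z
        p = np.exp(logits - logits.max())
        p /= p.sum()
        p[y] -= 1.0                              # softmax cross-entropy gradient
        g_z = self.Wd.T @ err + alpha * (self.Wc.T @ p)
        self.Wd -= self.lr * np.outer(err, z)
        self.Wc -= self.lr * alpha * np.outer(p, z)
        self.We -= self.lr * np.outer(g_z, x)

    def replay(self, n):
        """Pseudo-samples: decode latent noise, self-label with the classifier."""
        zs = rng.normal(0.0, 1.0, (n, self.d_lat))
        xs = zs @ self.Wd.T
        ys = np.array([self.predict(x) for x in xs])
        return xs, ys


# Sketch of the streaming loop: train on class 0 alone, then on class 1
# while mixing in replayed pseudo-samples instead of stored historical data.
def make_class(c, n=100):
    centre = np.zeros(4)
    centre[c] = 3.0
    return centre + rng.normal(0.0, 0.3, (n, 4))

model = PseudoGenerativeAE(d_in=4, d_lat=2, n_cls=2)
for x in make_class(0):
    model.step(x, 0)                 # phase 1: only class 0 in the stream
for x in make_class(1):
    model.step(x, 1)                 # phase 2: class 1 arrives
    for xr, yr in zip(*model.replay(1)):
        model.step(xr, int(yr))      # rehearse on generated data
```

The point of the sketch is the training loop shape, not the toy model: no raw history is kept, and the rehearsal data come from the model itself, which is what makes the storage cost independent of the stream length.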




Updated: 2020-08-18