Relation-Guided Representation Learning.
Neural Networks (IF 6.0) Pub Date: 2020-07-31, DOI: 10.1016/j.neunet.2020.07.014
Zhao Kang, Xiao Lu, Jian Liang, Kun Bai, Zenglin Xu

Deep auto-encoders (DAEs) have achieved great success in learning data representations thanks to the powerful representational capacity of neural networks. However, most DAEs focus only on the dominant structures needed to reconstruct the data from a latent space, neglecting rich latent structural information. In this work, we propose a new representation learning method that explicitly models and leverages sample relations, which in turn serve as supervision to guide the representation learning. Unlike previous work, our framework preserves the relations between samples well. Since predicting pairwise relations is itself a fundamental problem, our model learns them adaptively from the data, which provides considerable flexibility for encoding the real data manifold. The important roles of relation learning and representation learning are evaluated on the clustering task, and extensive experiments on benchmark data sets demonstrate the superiority of our approach. By embedding samples into a subspace, we further show that our method can address the large-scale and out-of-sample problems. Our source code is publicly available at: https://github.com/nbShawnLu/RGRL.
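The core idea — alternating between (1) adaptively inferring pairwise relations from the current latent codes and (2) updating the auto-encoder so that latent codes respect those relations — can be sketched as follows. This is a minimal toy illustration, not the authors' implementation (see their repository for that): it uses a linear auto-encoder, a softmax similarity as a stand-in for the learned relation matrix, and a Laplacian-style smoothness term ||Z − SZ||² as the relation-guided loss; all variable names are ours.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))  # toy data: 100 samples, 20 features
n, d_latent = len(X), 5

# Linear encoder/decoder weights (a stand-in for the deep auto-encoder).
W_enc = 0.1 * rng.standard_normal((20, d_latent))
W_dec = 0.1 * rng.standard_normal((d_latent, 20))

def recon_loss(W_enc, W_dec):
    return np.mean((X @ W_enc @ W_dec - X) ** 2)

loss_init = recon_loss(W_enc, W_dec)

lr, lam = 0.01, 0.05
for _ in range(300):
    Z = X @ W_enc        # latent representations
    X_hat = Z @ W_dec    # reconstructions

    # Step 1: adaptively infer pairwise relations S from the current latent
    # codes (row-wise softmax over inner-product similarities; the paper
    # learns S from data, this is only an illustrative surrogate).
    A = Z @ Z.T
    S = np.exp(A - A.max(axis=1, keepdims=True))
    S /= S.sum(axis=1, keepdims=True)

    # Step 2: gradient step on reconstruction loss plus the relation-guided
    # smoothness term ||Z - S Z||^2, treating S as fixed for this step
    # (an alternating-update scheme).
    E = X_hat - X
    G_rel = (np.eye(n) - S).T @ (Z - S @ Z)          # d/dZ of 0.5||Z - SZ||^2
    grad_dec = (Z.T @ E) / n
    grad_enc = (X.T @ (E @ W_dec.T) + lam * X.T @ G_rel) / n
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc

loss_final = recon_loss(W_enc, W_dec)
```

After training, the reconstruction loss should drop below its initial value while the relation term keeps nearby latent codes consistent with their relation-weighted neighborhood; in the paper the learned relations additionally drive the clustering evaluation.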




Updated: 2020-08-04