DCSR: Deep Clustering under Similarity and Reconstruction Constraints
Neurocomputing (IF 6) Pub Date: 2020-10-01, DOI: 10.1016/j.neucom.2020.06.013
Lei Yu , Wei Wang

Abstract Clustering is a difficult but crucial task in pattern recognition and machine learning. Inherently, clustering methods are always subject to uncertainty in the similarities between samples. To weaken the impact of such uncertainty, we develop Deep Clustering with both an Adaptive Siamese Loss (ASL) and a ReConstruction Loss (RCL), which adaptively accounts for similarities and stabilizes the clustering process. Technically, ASL focuses on mapping samples from the data space to either identical representations or orthogonal representations, while RCL provides a priori knowledge that stabilizes the clustering process. Benefiting from such artful modelling, DCSR enables deep networks to stably learn one-hot representations, yielding an end-to-end mechanism for joint clustering and representation learning. Extensive experiments demonstrate that our model achieves state-of-the-art performance on popular datasets, spanning image, audio, and text data.
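The abstract's description of the two constraints can be sketched in a few lines: a siamese-style pair loss that pulls similar samples toward identical representations and pushes dissimilar samples toward orthogonal ones, plus a reconstruction term. This is only an illustrative interpretation of the idea, not the paper's actual loss; the function names, the cosine-based formulation, and the weighting parameter `lam` are all assumptions for the sketch.

```python
import numpy as np

def siamese_pair_loss(z_i, z_j, similar):
    """Illustrative pair loss in the spirit of ASL (not the paper's exact form).

    Similar pairs are pulled toward the same representation (cosine -> 1);
    dissimilar pairs are pushed toward orthogonal representations (cosine -> 0).
    """
    cos = z_i @ z_j / (np.linalg.norm(z_i) * np.linalg.norm(z_j))
    return (1.0 - cos) if similar else cos ** 2

def reconstruction_loss(x, x_hat):
    """Mean squared reconstruction error, standing in for RCL."""
    return float(np.mean((x - x_hat) ** 2))

def total_loss(z_i, z_j, similar, x, x_hat, lam=1.0):
    """Combined objective; lam is a hypothetical trade-off weight."""
    return siamese_pair_loss(z_i, z_j, similar) + lam * reconstruction_loss(x, x_hat)
```

Under this reading, a similar pair with identical embeddings and an orthogonal dissimilar pair both incur zero pair loss, while the reconstruction term anchors the representations to the input data, stabilizing training.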

Updated: 2020-10-01