Unsupervised Deep Hashing with Similarity-Adaptive and Discrete Optimization
IEEE Transactions on Pattern Analysis and Machine Intelligence (IF 20.8) Pub Date: 2018-01-05, DOI: 10.1109/tpami.2018.2789887
Fumin Shen, Yan Xu, Li Liu, Yang Yang, Zi Huang, Heng Tao Shen

Recent vision and learning studies show that learning compact hash codes can facilitate massive data processing with significantly reduced storage and computation. In particular, learning deep hash functions has greatly improved retrieval performance, typically under semantic supervision. In contrast, current unsupervised deep hashing algorithms can hardly achieve satisfactory performance due to either relaxed optimization or the absence of a similarity-sensitive objective. In this work, we propose a simple yet effective unsupervised hashing framework, named Similarity-Adaptive Deep Hashing (SADH), which alternates over three training modules: deep hash model training, similarity graph updating, and binary code optimization. The key difference from the widely used two-step hashing method is that the output representations of the learned deep model are used to update the similarity graph matrix, which in turn improves the subsequent code optimization. In addition, to produce high-quality binary codes, we devise an effective discrete optimization algorithm that directly handles the binary constraints with a general hashing loss. Extensive experiments validate the efficacy of SADH, which consistently outperforms state-of-the-art methods by large margins.
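The sketch below is a minimal, hypothetical illustration of the alternating scheme named in the abstract (deep hash model training, similarity graph updating, binary code optimization), not the authors' implementation. The small MLP standing in for the deep hash model, the cosine k-NN graph construction, the regression-to-codes training loss, and the spectral-plus-sign rounding used as a proxy for the paper's discrete solver are all assumptions made for illustration.

```python
# Hypothetical sketch of SADH-style alternation; all components below are
# illustrative stand-ins, not the paper's actual network, graph, or solver.
import numpy as np
import torch
import torch.nn as nn


def build_similarity(features, k=5):
    """Cosine k-NN similarity graph built from the current representations (assumed choice)."""
    f = features / (np.linalg.norm(features, axis=1, keepdims=True) + 1e-12)
    sim = f @ f.T
    S = np.zeros_like(sim)
    idx = np.argsort(-sim, axis=1)[:, 1:k + 1]           # top-k neighbours, skip self
    rows = np.repeat(np.arange(len(f)), k)
    S[rows, idx.ravel()] = sim[rows, idx.ravel()]
    return (S + S.T) / 2                                  # symmetrise


def optimize_codes(S, num_bits):
    """Toy discrete step: Laplacian eigenvectors then sign rounding.
    The paper's solver handles the binary constraints directly; this is only a proxy."""
    lap = np.diag(S.sum(1)) - S
    _, vecs = np.linalg.eigh(lap)                          # ascending eigenvalues
    B = np.sign(vecs[:, 1:num_bits + 1])                   # drop the trivial eigenvector
    B[B == 0] = 1
    return B.astype(np.float32)


def train_hash_net(net, X, B, epochs=30, lr=1e-3):
    """Fit the deep hash model so tanh(net(x)) regresses the current binary codes."""
    opt = torch.optim.Adam(net.parameters(), lr=lr)
    x, b = torch.from_numpy(X), torch.from_numpy(B)
    for _ in range(epochs):
        opt.zero_grad()
        loss = ((torch.tanh(net(x)) - b) ** 2).mean()
        loss.backward()
        opt.step()
    return torch.tanh(net(x)).detach().numpy()


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 64)).astype(np.float32)  # toy "image features"
    num_bits = 16
    net = nn.Sequential(nn.Linear(64, 128), nn.ReLU(), nn.Linear(128, num_bits))

    # initial graph from raw features, then alternate the three modules
    S = build_similarity(X)
    B = optimize_codes(S, num_bits)
    for _ in range(3):                       # outer alternation
        H = train_hash_net(net, X, B)        # 1) deep hash model training
        S = build_similarity(H)              # 2) similarity graph updating
        B = optimize_codes(S, num_bits)      # 3) binary code optimization
    print("binary codes:", B.shape)
```

The point of the loop is the feedback the abstract describes: after each round, the graph is rebuilt from the deep model's own outputs rather than fixed once from raw features, so the subsequent code optimization sees an updated similarity structure.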

Updated: 2024-08-22