Discriminative dual-stream deep hashing for large-scale image retrieval
Information Processing & Management (IF 7.4), Pub Date: 2020-06-20, DOI: 10.1016/j.ipm.2020.102288
Yujuan Ding, Wai Keung Wong, Zhihui Lai, Zheng Zhang

Deep hashing, which leverages deep learning to boost the performance of hash learning, has become an important research topic. Most existing deep supervised hashing methods focus mainly on how to effectively preserve similarity in the hash codes while relying solely on pairwise supervision. However, such a pairwise similarity-preserving strategy cannot fully exploit the semantic information in most cases, which results in information loss. To address this problem, this paper proposes a discriminative dual-stream deep hashing (DDDH) method, which integrates a pairwise similarity loss and a classification loss into a unified framework to take full advantage of label information. Specifically, the pairwise similarity loss aims to preserve the similarity and structural information of the high-dimensional original data. Meanwhile, the designed classification loss enlarges the margin between different classes, which improves the discriminability of the learned binary codes. Moreover, an effective optimization algorithm is employed to train the hash code learning framework in an end-to-end manner. The results of extensive experiments on three image datasets demonstrate that our method is superior to several state-of-the-art deep and non-deep hashing methods. Ablation studies and further analysis confirm the effectiveness of introducing the classification loss into the overall hash learning framework.
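The core idea described above is a unified objective that combines a pairwise similarity-preserving term with a classification term on the learned codes. As a rough illustration only, the following PyTorch sketch is not the authors' exact DDDH formulation: the network layout, the specific loss forms, and the weights alpha and beta are assumptions made for demonstration. It trains relaxed (continuous) hash codes with a likelihood-style pairwise inner-product loss, a cross-entropy classification loss on top of the codes, and a quantization penalty.

```python
# Minimal sketch of combining a pairwise similarity loss with a classification
# loss for hash learning. Illustrative assumptions throughout; not the DDDH
# architecture or exact objective from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F

class HashNet(nn.Module):
    def __init__(self, feat_dim=512, code_len=48, num_classes=10):
        super().__init__()
        # Backbone features are assumed precomputed (e.g., CNN activations);
        # the hash layer maps them to code_len outputs in (-1, 1).
        self.hash_layer = nn.Sequential(nn.Linear(feat_dim, code_len), nn.Tanh())
        # Classification head on the hash outputs supplies the class-level
        # supervision that sharpens code discrimination.
        self.classifier = nn.Linear(code_len, num_classes)

    def forward(self, x):
        u = self.hash_layer(x)       # relaxed (continuous) hash codes
        logits = self.classifier(u)  # class predictions from the codes
        return u, logits

def pairwise_similarity_loss(u, labels):
    # Likelihood-style pairwise loss: inner products of relaxed codes should
    # be large for same-class pairs (s_ij = 1) and small otherwise (s_ij = 0).
    s = (labels.unsqueeze(0) == labels.unsqueeze(1)).float()
    inner = 0.5 * (u @ u.t())
    return (F.softplus(inner) - s * inner).mean()

def total_loss(u, logits, labels, alpha=1.0, beta=0.1):
    # Unified objective: pairwise term + classification term + a quantization
    # penalty pushing relaxed codes toward binary values (+1/-1).
    pair = pairwise_similarity_loss(u, labels)
    cls = F.cross_entropy(logits, labels)
    quant = (u.abs() - 1.0).pow(2).mean()
    return pair + alpha * cls + beta * quant

# Toy usage with random features standing in for CNN activations.
net = HashNet()
feats = torch.randn(32, 512)
labels = torch.randint(0, 10, (32,))
u, logits = net(feats)
loss = total_loss(u, logits, labels)
loss.backward()
binary_codes = torch.sign(u.detach())  # final binary codes for retrieval
```

At retrieval time the relaxed outputs are binarized with a sign function, as in the last line; the relative weighting of the pairwise and classification terms is the main design choice the abstract emphasizes.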




Updated: 2020-06-23