Quadruplet-Based Deep Cross-Modal Hashing
Computational Intelligence and Neuroscience Pub Date : 2021-07-02 , DOI: 10.1155/2021/9968716
Huan Liu 1 , Jiang Xiong 1 , Nian Zhang 2 , Fuming Liu 1 , Xitao Zou 1
Recently, benefiting from the storage and retrieval efficiency of hashing and the powerful discriminative feature extraction capability of deep neural networks, deep cross-modal hashing retrieval has drawn increasing attention. To preserve the semantic similarities of cross-modal instances during the hash mapping procedure, most existing deep cross-modal hashing methods learn deep hashing networks with a pairwise loss or a triplet loss. However, these losses may not fully explore the similarity relations across modalities. To address this problem, we introduce a quadruplet loss into deep cross-modal hashing and propose a quadruplet-based deep cross-modal hashing method (termed QDCMH). Extensive experiments on two benchmark cross-modal retrieval datasets show that the proposed method achieves state-of-the-art performance and demonstrate the effectiveness of the quadruplet loss in cross-modal hashing.
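The abstract does not spell out QDCMH's exact objective, but the standard quadruplet loss it builds on (Chen et al., CVPR 2017) extends the triplet loss with a second margin term over a pair of negatives. A minimal sketch, assuming squared Euclidean distance and hypothetical margin values:

```python
import numpy as np

def quadruplet_loss(anchor, positive, neg1, neg2, margin1=1.0, margin2=0.5):
    """Generic quadruplet loss sketch (not QDCMH's exact formulation).

    anchor and positive share a label; neg1 differs from the anchor;
    neg2 differs from both the anchor and neg1. In a cross-modal
    setting, the anchor could be an image hash code and the other
    three could be text hash codes (or vice versa).
    """
    d = lambda x, y: float(np.sum((x - y) ** 2))  # squared Euclidean distance
    # Triplet-style term: the negative should be farther from the
    # anchor than the positive, by at least margin1.
    term1 = max(0.0, d(anchor, positive) - d(anchor, neg1) + margin1)
    # Extra quadruplet term: the two negatives should also be farther
    # apart than the anchor-positive pair; this inter-class constraint
    # is what a plain triplet loss does not impose.
    term2 = max(0.0, d(anchor, positive) - d(neg1, neg2) + margin2)
    return term1 + term2
```

When the embedding already separates the classes by more than the margins, both hinge terms vanish and the loss is zero; otherwise the violating term contributes its margin deficit.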

Updated: 2021-07-02