Hidden multi-distance loss-based full-convolution hashing
Applied Soft Computing (IF 8.7) Pub Date: 2021-05-21, DOI: 10.1016/j.asoc.2021.107508
Mingwen Yuan , Binbin Qin , Jianhao Li , Jiangbo Qian , Yu Xin

To improve retrieval efficiency and quality, learning to hash has been widely used in approximate nearest neighbor queries. Deep learning extracts data features with high precision; therefore, deep-learning-based hashing has attracted increasing attention. Existing methods, however, have weaknesses such as complex training and loss of spatial information. We design a new deep hashing algorithm, named HLFH, that is technically simple yet achieves remarkably good performance. HLFH optimizes two aspects: network structure and hashing loss. Concerning network structure, a new fully convolutional hashing network is proposed to preserve the spatial information of features, and a smooth activation function is used in the hashing layer to reduce quantization error. Concerning hashing loss, the semantic information of the data is used to generate binary codes via a hidden multi-distance loss, i.e., a combination of triplet loss and quadruplet loss. With these two techniques, our method is more accurate than many state-of-the-art methods.
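The abstract describes two ingredients: a smooth activation in the hashing layer (to relax the non-differentiable sign step) and a multi-distance loss combining triplet and quadruplet terms. The sketch below illustrates the general idea in NumPy; the choice of tanh as the smooth activation, the squared-Euclidean distance, and the margin values are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def smooth_hash(x):
    # tanh is a common smooth surrogate for sign(), keeping outputs near
    # {-1, +1} while staying differentiable (assumption: the paper's exact
    # smooth activation may differ)
    return np.tanh(x)

def sq_dist(u, v):
    # squared Euclidean distance between relaxed binary codes
    return float(np.sum((u - v) ** 2))

def multi_distance_loss(anchor, positive, neg1, neg2, m1=1.0, m2=0.5):
    """Sketch of a combined triplet + quadruplet loss.

    triplet term:    anchor-positive distance should be smaller than the
                     anchor-negative distance by margin m1
    quadruplet term: anchor-positive distance should also be smaller than
                     the distance between two distinct negatives, by m2
    Margins m1, m2 are hypothetical defaults.
    """
    a, p, n1, n2 = (smooth_hash(x) for x in (anchor, positive, neg1, neg2))
    d_ap = sq_dist(a, p)
    d_an = sq_dist(a, n1)
    d_nn = sq_dist(n1, n2)
    triplet = max(0.0, d_ap - d_an + m1)
    quadruplet = max(0.0, d_ap - d_nn + m2)
    return triplet + quadruplet

# A well-separated quadruplet incurs zero loss; a violated one is penalized.
easy = multi_distance_loss(np.array([2.0, 2.0]), np.array([2.0, 2.0]),
                           np.array([-2.0, -2.0]), np.array([2.0, -2.0]))
hard = multi_distance_loss(np.array([2.0, 2.0]), np.array([-2.0, -2.0]),
                           np.array([2.0, 2.0]), np.array([2.0, 2.0]))
```

The quadruplet term is what makes the loss "multi-distance": unlike a pure triplet loss, it also constrains inter-negative distances, which tends to enlarge inter-class gaps in the binary code space.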




Updated: 2021-05-30