Fast Supervised Discrete Hashing
IEEE Transactions on Pattern Analysis and Machine Intelligence (IF 23.6), Pub Date: 2017-03-07, DOI: 10.1109/tpami.2017.2678475
Jie Gui, Tongliang Liu, Zhenan Sun, Dacheng Tao, Tieniu Tan

Learning-based hashing algorithms are a hot topic because they can greatly increase the scale at which existing methods operate. In this paper, we propose a new learning-based hashing method called "fast supervised discrete hashing" (FSDH), which builds on "supervised discrete hashing" (SDH). Regressing the training examples (or their hash codes) to the corresponding class labels via ordinary least squares is widely used in supervised hashing. Rather than adopting this approach, FSDH reverses it: it uses a very simple yet effective regression of the class labels of the training examples to the corresponding hash codes, which accelerates the algorithm. To the best of our knowledge, this strategy has not previously been used for hashing. Traditional SDH decomposes the optimization into three sub-problems, and the most critical one, discrete optimization of the binary hash codes, is solved with iterative discrete cyclic coordinate descent (DCC), which is time-consuming. In contrast, FSDH has a closed-form solution and requires only a single hash-code-solving step rather than an iterative one, which is highly efficient. Furthermore, FSDH is usually faster than SDH at solving the projection matrix for least squares regression, making FSDH generally faster than SDH overall. For example, our results show that FSDH is about 12 times faster than SDH with 128 hashing bits on the CIFAR-10 database, and about 151 times faster than FastHash with 64 hashing bits on the MNIST database. Our experimental results show that FSDH is not only fast but also outperforms the other compared methods.
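The abstract describes FSDH's key change to SDH: regressing class labels to hash codes so the binary-code sub-problem has a closed-form sign update instead of iterative DCC. Below is a minimal NumPy sketch of alternating updates in that spirit. It is a hypothetical reconstruction from the abstract alone, not the authors' implementation: the function name `fsdh_train`, the ridge parameters `lam` and `nu`, and the plain linear feature map are all assumptions.

```python
import numpy as np

def fsdh_train(X, Y, n_bits, lam=1.0, nu=1e-5, n_iter=5):
    """Sketch of FSDH-style alternating optimization (assumed form).

    X : (n, d) feature matrix; Y : (n, c) one-hot label matrix.
    Returns binary codes B in {-1, +1}^(n, n_bits) plus the two
    regression matrices W (labels -> codes) and P (features -> codes).
    """
    n, d = X.shape
    rng = np.random.default_rng(0)
    B = np.sign(rng.standard_normal((n, n_bits)))  # random binary init

    for _ in range(n_iter):
        # W-step: ridge regression of class labels Y onto codes B.
        # With one-hot Y, Y.T @ Y is diagonal, so this solve is cheap.
        W = np.linalg.solve(Y.T @ Y + lam * np.eye(Y.shape[1]), Y.T @ B)
        # P-step: projection matrix from features to codes (ridge form).
        P = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ B)
        # B-step: single closed-form sign update (no DCC iterations),
        # minimizing ||B - Y W||^2 + nu * ||B - X P||^2 over B in {-1,+1}.
        B = np.sign(Y @ W + nu * (X @ P))
        B[B == 0] = 1  # break ties toward +1 to keep codes binary

    return B, W, P
```

The point of the sketch is the B-step: because both residual terms are elementwise in B, the minimizer over {-1, +1} is just the sign of the weighted sum, which is the "single rather than iterative hash code-solving step" the abstract contrasts with SDH's DCC loop.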

Updated: 2018-01-09