Recently, it has been observed that {0,1,-1}-ternary codes generated from deep features by simple hard thresholding tend to outperform {-1,1}-binary codes in image retrieval. To obtain better ternary codes, we propose, for the first time, to jointly learn the features and the codes by appending a smoothed function to the network. During training, the function evolves into a non-smooth ternary function via a continuation method. This circumvents the difficulty of directly training discrete functions and reduces the quantization error of the ternary codes. Experiments show that the generated codes indeed achieve higher retrieval accuracy.
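The idea of a smoothed ternary function annealed toward hard thresholding can be sketched as follows. The abstract does not specify the smoothing function, so the tanh-based surrogate, the threshold `delta`, and the sharpness parameter `beta` below are illustrative assumptions: as `beta` grows during training (the continuation step), the smooth surrogate converges to hard {-1, 0, +1} quantization.

```python
import numpy as np

def smoothed_ternary(x, beta, delta=0.5):
    # Smooth surrogate for ternary quantization (illustrative choice).
    # As beta -> infinity this converges pointwise to hard
    # thresholding of x into {-1, 0, +1} at +/- delta.
    return 0.5 * (np.tanh(beta * (x - delta)) + np.tanh(beta * (x + delta)))

def hard_ternary(x, delta=0.5):
    # Hard thresholding used at inference to emit the final codes.
    return np.where(x > delta, 1.0, np.where(x < -delta, -1.0, 0.0))

features = np.array([-1.2, -0.3, 0.1, 0.9])
for beta in (1.0, 5.0, 50.0):
    print(beta, smoothed_ternary(features, beta))
print(hard_ternary(features))  # codes in {-1, 0, +1}
```

Because the surrogate is differentiable for any finite `beta`, gradients can flow through it during end-to-end training; the continuation method then gradually increases `beta` so the learned activations approach the discrete codes, shrinking the gap (quantization error) between the training-time outputs and the final hard-thresholded codes.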