Recently, deep hashing methods have been widely applied to image retrieval tasks. Most existing deep hashing approaches adopt one-to-one quantization to reduce information loss. However, such class-unrelated quantization cannot provide discriminative feedback for network training. In addition, these methods utilize only a single label to integrate the supervision information of the data for hash function learning, which may result in inferior network generalization and relatively low-quality hash codes, since the inter-class information of the data is entirely ignored. In this paper, we propose a dual semantic asymmetric hashing (DSAH) method that generates discriminative hash codes under three-fold constraints. First, DSAH utilizes class priors to conduct class structure quantization, so that class information is transmitted during the quantization process. Second, a simple yet effective label mechanism is designed to characterize both the intra-class compactness and the inter-class separability of the data, thereby achieving semantic-sensitive binary code learning. Finally, a meaningful pairwise similarity preserving loss is devised to minimize the distances between class-related network outputs based on an affinity graph. With these three components, high-quality hash codes can be generated by the network. Extensive experiments on various datasets demonstrate the superiority of DSAH over state-of-the-art deep hashing methods.
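To make the third component concrete, the sketch below shows one common form an affinity-weighted pairwise similarity preserving loss can take: squared Euclidean distances between network outputs, weighted by an affinity graph so that class-related pairs are pulled together. The function name, NumPy formulation, and the choice of squared distance are illustrative assumptions, not the paper's exact loss.

```python
import numpy as np

def affinity_pairwise_loss(outputs, affinity):
    """Hypothetical sketch of a pairwise similarity preserving loss.

    outputs:  (n, d) real-valued network outputs before binarization.
    affinity: (n, n) affinity graph; affinity[i, j] > 0 marks a
              class-related pair whose outputs should be close.
    """
    # Pairwise differences via broadcasting: shape (n, n, d).
    diff = outputs[:, None, :] - outputs[None, :, :]
    # Squared Euclidean distance for every pair: shape (n, n).
    sq_dist = (diff ** 2).sum(axis=-1)
    # Affinity-weighted sum; divide by 2 since each pair is counted twice.
    return float((affinity * sq_dist).sum() / 2.0)
```

Minimizing this quantity drives the outputs of strongly affine (class-related) samples toward each other, which is the behavior the abstract describes.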