Deep supervised hashing for image retrieval has attracted researchers' attention due to its high efficiency and superior retrieval performance. Most existing deep supervised hashing methods, which are based on pairwise/triplet labels, suffer from expensive computational cost and insufficient utilization of semantic information. Recently, deep classwise hashing alternatively introduced a classwise loss supervised by class label information; however, we find it still has drawbacks. In this paper, we propose an improved deep classwise hashing, which enables hashing learning and class center learning simultaneously. Specifically, we design a two-step strategy for center similarity learning. It interacts with the classwise loss to attract each class center to concentrate on its intra-class samples while pushing other class centers as far away as possible. The center similarity learning contributes to generating more compact and discriminative hash codes. We conduct experiments on three benchmark datasets. The results show that the proposed method effectively surpasses the original method and outperforms state-of-the-art baselines under various commonly used evaluation metrics for image retrieval.
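As a rough illustration of this two-step interaction, the following minimal PyTorch sketch combines an attraction term (pulling each sample's relaxed hash code toward its own class center) with a repulsion term (keeping distinct class centers at least a margin apart). The function name `classwise_center_loss`, the `margin` hyperparameter, and the specific distance choices are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def classwise_center_loss(codes, labels, centers, margin=2.0):
    """Sketch of a classwise loss with center similarity learning.

    codes:   (B, K) relaxed binary codes in [-1, 1] from the hashing network
    labels:  (B,)   integer class labels
    centers: (C, K) learnable class centers, one per class

    The attraction/repulsion split below assumes an MSE pull toward the
    own-class center and a margin-based push between centers; the paper's
    actual loss terms may differ.
    """
    # Step 1: attract each sample's code to its own class center
    own_center = centers[labels]                 # (B, K)
    attract = F.mse_loss(codes, own_center)

    # Step 2: push distinct class centers apart by at least `margin`
    dist = torch.cdist(centers, centers)         # (C, C) pairwise distances
    num_classes = centers.size(0)
    off_diag = ~torch.eye(num_classes, dtype=torch.bool, device=centers.device)
    repel = F.relu(margin - dist[off_diag]).mean()

    return attract + repel
```

In a training loop, `centers` would be a learnable `nn.Parameter` updated jointly with the network, so hashing learning and class center learning proceed simultaneously as described above.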