State-of-the-art unsupervised re-ID methods train the neural networks using a memory-based non-parametric softmax loss. Instance feature vectors stored in memory are assigned pseudo-labels by clustering and updated at the instance level. However, the varying cluster sizes lead to inconsistency in the updating progress of each cluster. To solve this problem, we present Cluster Contrast, which stores feature vectors and computes the contrastive loss at the cluster level. Our approach employs a unique cluster representation to describe each cluster, resulting in a cluster-level memory dictionary. In this way, the consistency of clustering can be effectively maintained throughout the pipeline and the GPU memory consumption can be significantly reduced. Thus, our method can solve the problem of cluster inconsistency and is applicable to larger datasets. In addition, we adopt different clustering algorithms to demonstrate the robustness and generalization of our framework. The application of Cluster Contrast to a standard unsupervised re-ID pipeline achieves considerable improvements of 9.9%, 8.3%, and 12.1% mAP over state-of-the-art purely unsupervised re-ID methods and 5.5%, 4.8%, and 4.4% mAP over state-of-the-art unsupervised domain adaptation re-ID methods on the Market, Duke, and MSMT17 datasets. Code is available at https://github.com/alibaba/cluster-contrast.
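To make the cluster-level memory concrete, the following is a minimal PyTorch sketch of the idea described above: one representation per cluster, a non-parametric softmax (InfoNCE-style) loss computed against those representations, and a momentum update applied per cluster rather than per instance. The class name `ClusterMemory` and the `temperature`/`momentum` hyperparameters are illustrative assumptions, not the exact implementation from the linked repository.

```python
import torch
import torch.nn.functional as F


class ClusterMemory(torch.nn.Module):
    """Sketch of a cluster-level memory dictionary.

    Stores one L2-normalized feature vector per cluster; both the
    contrastive loss and the momentum update operate at the cluster
    level, so every cluster advances at the same pace regardless of
    how many instances it contains.
    """

    def __init__(self, feat_dim, num_clusters, temperature=0.05, momentum=0.2):
        super().__init__()
        self.temperature = temperature  # softmax temperature
        self.momentum = momentum        # memory update rate
        # One representation per cluster, e.g. initialized with the
        # cluster centroids produced by the clustering step.
        self.register_buffer("memory", torch.zeros(num_clusters, feat_dim))

    def forward(self, features, labels):
        """features: (B, D) query features; labels: (B,) pseudo-labels."""
        features = F.normalize(features, dim=1)
        # Similarity of each query to every cluster representation.
        logits = features @ self.memory.t() / self.temperature
        loss = F.cross_entropy(logits, labels)

        # Momentum update of the matched cluster representations.
        with torch.no_grad():
            for feat, label in zip(features, labels):
                self.memory[label] = (
                    self.momentum * self.memory[label]
                    + (1.0 - self.momentum) * feat
                )
                self.memory[label] = F.normalize(self.memory[label], dim=0)
        return loss
```

Because the memory holds one vector per cluster instead of one per instance, its size scales with the number of pseudo-identities rather than the number of images, which is the source of the reduced GPU memory consumption claimed above.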