Unsupervised domain adaptive person re-identification (re-ID) is challenging because, unlike general domain adaptation tasks, there is no class overlap between the source and target domain data, which leads to a significant domain gap. State-of-the-art unsupervised re-ID methods train neural networks with a memory-based contrastive loss. However, performing contrastive learning by treating each unlabeled instance as its own class leads to class collision, and the update intensity in the memory bank is inconsistent because different categories contain different numbers of instances. To address these problems, we propose Prototype Dictionary Learning for person re-ID, which exploits both source domain and target domain data in a single training stage while avoiding class collision and inconsistent update intensity through cluster-level prototype dictionary learning. To reduce the interference of the domain gap on the model, we propose a local-enhance module that improves the model's domain adaptation without increasing the number of model parameters. Experiments on two large datasets demonstrate the effectiveness of prototype dictionary learning: it achieves 71.5\% mAP on the Market-to-Duke task, a 2.3\% improvement over state-of-the-art unsupervised domain adaptive re-ID methods, and 83.9\% mAP on the Duke-to-Market task, a 4.4\% improvement over state-of-the-art unsupervised domain adaptive re-ID methods.
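The cluster-level idea can be illustrated with a minimal sketch: instead of one memory slot per instance, the dictionary keeps one prototype per cluster, so the contrastive loss contrasts a query against cluster centroids and every class is updated with the same momentum intensity. All function and variable names below are illustrative, not the paper's implementation.

```python
import numpy as np

def prototype_contrastive_loss(feat, prototypes, label, tau=0.05):
    """InfoNCE-style loss against cluster prototypes.

    feat: (d,) L2-normalized query feature.
    prototypes: (K, d) L2-normalized cluster centroids (the prototype dictionary).
    label: index of the query's cluster.
    """
    logits = prototypes @ feat / tau      # similarity to every cluster prototype
    logits = logits - logits.max()        # numerical stability before exp()
    probs = np.exp(logits) / np.exp(logits).sum()
    return -np.log(probs[label])

def momentum_update(prototypes, feat, label, m=0.2):
    """Update only the matched prototype with a fixed momentum.

    Because each cluster owns exactly one dictionary entry, the update
    intensity is identical for all classes regardless of cluster size.
    """
    p = (1.0 - m) * prototypes[label] + m * feat
    prototypes[label] = p / np.linalg.norm(p)  # keep centroids on the unit sphere
    return prototypes
```

A query close to its own prototype yields a small loss, while an instance-level memory would instead treat every other image of the same person as a negative, which is exactly the class-collision problem the abstract describes.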