Person re-identification (re-ID) matches people across multiple non-overlapping cameras. Despite the increasing deployment of airborne platforms in surveillance, existing person re-ID benchmarks focus on ground-to-ground matching, with very limited work on aerial-to-aerial matching. We propose a new benchmark dataset, AG-ReID, which performs person re-ID matching in a new setting: across aerial and ground cameras. Our dataset contains 21,983 images of 388 identities, with 15 soft attributes annotated for each identity. The data was collected by a UAV flying at altitudes between 15 and 45 meters and by a ground-based CCTV camera on a university campus. Our dataset poses a novel elevated-viewpoint challenge for person re-ID, since person appearance differs significantly across these cameras. To address this challenge, we propose an explainable algorithm that uses the soft attributes to guide the training of the person re-ID model. Experiments demonstrate the efficacy of our method on the aerial-ground person re-ID task. The dataset will be published and the baseline code open-sourced to facilitate research in this area.