Multi-label zero-shot learning (ZSL) is a more realistic counterpart of standard single-label ZSL, since several objects can co-exist in a natural image. However, the occurrence of multiple objects complicates the reasoning and requires region-specific processing of visual features to preserve their contextual cues. We note that the best existing multi-label ZSL method takes a shared approach towards attending to region features, with a common set of attention maps for all the classes. Such shared maps lead to diffused attention, which does not discriminatively focus on relevant locations when the number of classes is large. Moreover, mapping spatially-pooled visual features to the class semantics leads to inter-class feature entanglement, thus hampering the classification. Here, we propose an alternate approach towards region-based discriminability-preserving multi-label zero-shot classification. Our approach maintains the spatial resolution to preserve region-level characteristics and utilizes a bi-level attention module (BiAM) to enrich the features by incorporating both region and scene context information. The enriched region-level features are then mapped to the class semantics, and only their class predictions are spatially pooled to obtain image-level predictions, thereby keeping the multi-class features disentangled. Our approach sets a new state of the art on two large-scale multi-label zero-shot benchmarks: NUS-WIDE and Open Images. On NUS-WIDE, our approach achieves an absolute gain of 6.9% mAP for ZSL, compared to the best published results.
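The classification scheme described above, in which region-level features are first mapped to the class semantics and only the resulting per-region class predictions are spatially pooled, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the feature dimensions, the dot-product compatibility with class semantic embeddings, and the choice of max-pooling over regions are all illustrative assumptions.

```python
import numpy as np

def region_based_zsl_scores(region_feats, class_embeds, pool="max"):
    """Map each region to class semantics, then pool class scores.

    region_feats: (R, D) array, one D-dim enriched feature per region
    class_embeds: (C, D) array, one D-dim semantic embedding per class

    Pooling the per-region class *scores* (rather than pooling the
    features before classification) is what keeps the multi-class
    features disentangled, per the argument in the abstract.
    """
    # Compatibility of every region with every class: shape (R, C).
    region_scores = region_feats @ class_embeds.T
    # Spatially pool class predictions to image level: shape (C,).
    if pool == "max":
        return region_scores.max(axis=0)
    return region_scores.mean(axis=0)

rng = np.random.default_rng(0)
feats = rng.standard_normal((49, 300))   # e.g. a 7x7 grid of regions
embeds = rng.standard_normal((81, 300))  # e.g. 81 class word vectors
scores = region_based_zsl_scores(feats, embeds)
print(scores.shape)
```

At test time, unseen classes are scored simply by swapping in their semantic embeddings; no visual training data for those classes is needed.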