Zero-Shot Learning (ZSL) aims to transfer classification capability from seen to unseen classes. Recent methods have shown that generalization and specialization are two essential abilities for achieving good ZSL performance. However, existing methods focus on only one of these abilities, yielding models that are either too general, with degraded classification accuracy, or too specialized to generalize to unseen classes. In this paper, we propose an end-to-end network with balanced generalization and specialization abilities, termed BGSNet, which exploits both abilities and balances them at the instance and dataset levels. Specifically, BGSNet consists of two branches: the Generalization Network (GNet), which applies episodic meta-learning to acquire generalized knowledge, and the Balanced Specialization Network (BSNet), which adopts multiple attentive extractors to extract discriminative features and achieve instance-level balance. A novel self-adjusting diversity loss is designed to optimize BSNet with less redundancy and more diversity. We further propose a differentiable dataset-level balance and update its weights with a linear annealing schedule that simulates network pruning, thereby obtaining an optimal structure for BSNet at low cost while achieving dataset-level balance. Experiments on four benchmark datasets demonstrate our model's effectiveness, and thorough component ablations confirm the necessity of integrating generalization and specialization abilities.
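The abstract mentions updating the dataset-level balance weights under a linear annealing schedule. The paper does not specify the exact schedule here, so the following is only a minimal illustrative sketch of generic linear annealing; the function name, endpoints, and defaults are assumptions, not the authors' implementation.

```python
def linear_anneal(step, total_steps, start=1.0, end=0.0):
    """Linearly interpolate a weight from `start` to `end` over training.

    Illustrative sketch only: in a pruning-style schedule, a branch
    weight annealed toward zero gradually removes that branch's
    contribution, approximating structural pruning differentiably.
    """
    frac = min(max(step / total_steps, 0.0), 1.0)  # clamp to [0, 1]
    return start + (end - start) * frac
```

For example, `linear_anneal(5, 10)` returns 0.5, halfway between the start and end weights; weights that reach (near) zero by the end of training correspond to pruned components.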