Few-shot learning (FSL), which aims to classify unseen classes with only a few samples, is challenging due to data scarcity. Although various generative methods have been explored for FSL, their entangled generation processes exacerbate the distribution shift inherent in FSL, greatly limiting the quality of the generated samples. To address these challenges, we propose a novel Information Bottleneck (IB) based Disentangled Generation Framework for FSL, termed DisGenIB, which simultaneously guarantees the discrimination and diversity of generated samples. Specifically, we formulate a novel information-bottleneck framework that applies to both disentangled representation learning and sample generation. Unlike existing IB-based methods, which can hardly exploit priors, we demonstrate that DisGenIB can effectively utilize priors to further facilitate disentanglement. We further prove in theory that several previous generative and disentanglement methods are special cases of DisGenIB, which demonstrates its generality. Extensive experiments on challenging FSL benchmarks confirm the effectiveness and superiority of DisGenIB, as well as the validity of our theoretical analyses. Our code will be open-sourced upon acceptance.
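For context, the classical information bottleneck principle learns a representation $Z$ of an input $X$ that remains informative about a target $Y$ while compressing $X$. A minimal sketch of this generic objective (not the specific DisGenIB formulation, which is derived later in the paper) is
\[
\min_{p(z \mid x)} \; I(X;Z) \;-\; \beta \, I(Z;Y),
\]
where $I(\cdot\,;\cdot)$ denotes mutual information and $\beta > 0$ trades off compression of $X$ against preservation of information about $Y$.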