Data augmentation is a powerful technique for improving few-shot classification performance. It generates additional samples as supplements, after which the task can be recast as a standard supervised learning problem. However, most mainstream data-augmentation-based approaches consider only single-modality information, which leads to low diversity and quality in the generated features. In this paper, we present a novel multi-modal data augmentation approach named the Dizygotic Conditional Variational AutoEncoder (DCVAE) to address this issue. DCVAE performs feature synthesis by pairing two Conditional Variational AutoEncoders (CVAEs) that share the same seed but use different modality conditions, in a dizygotic symbiosis manner. The generated features of the two CVAEs are then adaptively combined to yield the final feature, which can be converted back into its paired conditions while ensuring that these conditions remain consistent with the original conditions not only in representation but also in function. DCVAE essentially offers a new perspective on data augmentation in multi-modal scenarios by exploiting the complementarity of prior information from different modalities. Extensive experimental results demonstrate that our work achieves state-of-the-art performance on the miniImageNet, CIFAR-FS, and CUB datasets, and works well even when one modality is partially absent.
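The dizygotic generation scheme described above can be sketched in a few lines. This is a minimal toy illustration, not the paper's actual architecture: the linear decoders, the sigmoid gate, and all weight names (`W_vis`, `W_sem`, `w_gate`) are hypothetical stand-ins for the learned CVAE decoders and the adaptive combination module. The key idea it demonstrates is that a single shared latent seed `z` conditions both modality-specific generators before their outputs are adaptively fused.

```python
import numpy as np

rng = np.random.default_rng(0)

d_z, d_c, d_f = 8, 5, 16  # latent, condition, and feature dimensions (toy sizes)

# Hypothetical decoder weights for the two modality-specific CVAEs.
W_vis = rng.normal(size=(d_z + d_c, d_f))
W_sem = rng.normal(size=(d_z + d_c, d_f))
# Hypothetical gating weights for the adaptive combination step.
w_gate = rng.normal(size=(2 * d_f,))

def decoder(z, cond, W):
    # Toy linear "decoder": maps [latent ; condition] to a feature vector.
    return np.tanh(np.concatenate([z, cond]) @ W)

def dcvae_generate(c_vis, c_sem):
    # Dizygotic pairing: the SAME latent seed z feeds both CVAE decoders,
    # each conditioned on a different modality.
    z = rng.normal(size=d_z)
    f_vis = decoder(z, c_vis, W_vis)  # feature from the visually conditioned CVAE
    f_sem = decoder(z, c_sem, W_sem)  # feature from the semantically conditioned CVAE
    # Adaptive combination: a (here untrained) gate weighs the two features.
    a = 1.0 / (1.0 + np.exp(-np.concatenate([f_vis, f_sem]) @ w_gate))
    return a * f_vis + (1.0 - a) * f_sem

feat = dcvae_generate(rng.normal(size=d_c), rng.normal(size=d_c))
print(feat.shape)  # one synthesized feature vector of dimension d_f
```

In the actual method the decoders and gate would be trained jointly, and the generated feature would additionally be decoded back to both conditions to enforce the representation- and function-level consistency mentioned above.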