Few-shot classification (FSC) requires training models with only a few (typically one to five) data points per class. Meta-learning has proven effective at learning a parameterized model for FSC by training on a variety of other classification tasks. In this work, we propose PLATINUM (semi-suPervised modeL Agnostic meTa-learnIng usiNg sUbmodular Mutual information), a novel semi-supervised model-agnostic meta-learning framework that uses submodular mutual information (SMI) functions to boost the performance of FSC. PLATINUM leverages unlabeled data in both the inner and outer loops via SMI functions during meta-training, yielding richer meta-learned parameterizations for meta-testing. We study the performance of PLATINUM in two scenarios: (1) where the unlabeled data points belong to the same set of classes as the labeled set of the current episode, and (2) where the unlabeled set contains out-of-distribution classes that do not appear in the labeled set. We evaluate our method under various settings on the miniImageNet, tieredImageNet, and Fewshot-CIFAR100 datasets. Our experiments show that PLATINUM outperforms MAML and semi-supervised approaches such as pseudo-labeling for semi-supervised FSC, especially when the ratio of labeled examples per class is small.
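To make the selection step concrete, the following is a minimal sketch of how unlabeled points could be chosen for an episode by greedily maximizing an SMI function against the labeled set. It assumes a facility-location-style SMI instantiation over cosine similarities of feature embeddings; the function names (`fl_smi`, `greedy_smi_select`), the similarity measure, and the specific SMI variant are illustrative assumptions, not the exact objective or per-class pseudo-labeling procedure used in PLATINUM.

```python
import numpy as np

def cosine_sim(X, Y):
    # Pairwise cosine similarities between rows of X and rows of Y.
    Xn = X / (np.linalg.norm(X, axis=1, keepdims=True) + 1e-12)
    Yn = Y / (np.linalg.norm(Y, axis=1, keepdims=True) + 1e-12)
    return Xn @ Yn.T

def fl_smi(sel_q_sims, eta=1.0):
    # Facility-location-style SMI between a selected set A and the labeled
    # (query) set Q, given the |A| x |Q| similarity block s(a, q).
    # Hypothetical instantiation used only for illustration.
    if sel_q_sims.shape[0] == 0:
        return 0.0
    return sel_q_sims.max(axis=0).sum() + eta * sel_q_sims.max(axis=1).sum()

def greedy_smi_select(unlabeled_feats, labeled_feats, budget, eta=1.0):
    # Greedily pick `budget` unlabeled points maximizing the SMI marginal gain
    # with respect to the labeled examples of one class in the episode.
    sims = cosine_sim(unlabeled_feats, labeled_feats)  # |U| x |Q|
    selected, remaining, current = [], list(range(len(unlabeled_feats))), 0.0
    for _ in range(min(budget, len(remaining))):
        gains = [fl_smi(sims[selected + [i]], eta) - current for i in remaining]
        best = remaining[int(np.argmax(gains))]
        selected.append(best)
        remaining.remove(best)
        current = fl_smi(sims[selected], eta)
    return selected

# Toy usage: features would come from the meta-learner's backbone;
# random vectors stand in here.
rng = np.random.default_rng(0)
picked = greedy_smi_select(rng.normal(size=(50, 16)),
                           rng.normal(size=(5, 16)), budget=3)
print(picked)
```

In a semi-supervised meta-training loop, a selection of this kind would be run per episode (inner loop) and across episodes (outer loop), with the chosen unlabeled points pseudo-labeled and added to the support set before adaptation.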