We develop a transductive meta-learning method that uses unlabelled instances to improve few-shot image classification performance. Our approach combines a regularized Mahalanobis-distance-based soft k-means clustering procedure with a modified state-of-the-art neural adaptive feature extractor to achieve improved test-time classification accuracy using unlabelled data. We evaluate our method on transductive few-shot learning tasks, in which the goal is to jointly predict labels for query (test) examples given a set of support (training) examples. We achieve state-of-the-art performance on the Meta-Dataset, mini-ImageNet and tiered-ImageNet benchmarks. All trained models and code have been made publicly available at github.com/plai-group/simple-cnaps.
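To make the core idea concrete, the following is a minimal, hypothetical sketch of a soft k-means assignment step that scores query points by a regularized Mahalanobis distance to class centroids. The function names, the identity-blending regularizer, and the blending weight `lam` are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def regularized_covariance(feats, lam=0.5):
    """Blend a per-class sample covariance with the identity matrix.

    Regularization keeps the estimate well-conditioned when the number
    of support examples per class is small. (Illustrative scheme.)
    """
    d = feats.shape[1]
    cov = np.cov(feats, rowvar=False) if feats.shape[0] > 1 else np.eye(d)
    return lam * cov + (1.0 - lam) * np.eye(d)

def soft_assignments(queries, centroids, covs):
    """Softmax over negative squared Mahalanobis distances to centroids."""
    dists = []
    for mu, cov in zip(centroids, covs):
        diff = queries - mu                       # (n, d)
        inv = np.linalg.inv(cov)                  # (d, d)
        d2 = np.einsum('nd,de,ne->n', diff, inv, diff)
        dists.append(d2)
    dists = np.stack(dists, axis=1)               # (n, k)
    logits = -dists
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    w = np.exp(logits)
    return w / w.sum(axis=1, keepdims=True)       # rows sum to 1

# Toy usage: two well-separated classes in a 4-dimensional feature space.
rng = np.random.default_rng(0)
support_a = rng.normal(0.0, 1.0, size=(10, 4))
support_b = rng.normal(5.0, 1.0, size=(10, 4))
centroids = [support_a.mean(0), support_b.mean(0)]
covs = [regularized_covariance(support_a), regularized_covariance(support_b)]
queries = rng.normal(0.0, 1.0, size=(5, 4))       # drawn near class A
probs = soft_assignments(queries, centroids, covs)
```

In a transductive setting, these soft assignments over the unlabelled queries could be used to refine the class centroids and covariances before making final predictions.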