Few-Shot Learning (FSL) algorithms have made substantial progress in learning novel concepts from just a handful of labelled samples. To classify query instances from novel classes encountered at test-time, they only require a support set composed of a few labelled samples. FSL benchmarks commonly assume that those queries come from the same distribution as instances in the support set. However, in a realistic setting, the data distribution is likely to change, a situation referred to as Distribution Shift (DS). The present work addresses the new and challenging problem of Few-Shot Learning under Support/Query Shift (FSQS), i.e., when support and query instances are sampled from related but different distributions. Our contributions are the following. First, we release a testbed for FSQS, including datasets, relevant baselines and a protocol for rigorous and reproducible evaluation. Second, we observe that well-established FSL algorithms unsurprisingly suffer a considerable drop in accuracy when facing FSQS, stressing the significance of our study. Finally, we show that transductive algorithms can limit the inopportune effect of DS. In particular, we study the roles of both Batch-Normalization and Optimal Transport (OT) in aligning distributions, bridging Unsupervised Domain Adaptation with FSL. This results in a new method that efficiently combines OT with the celebrated Prototypical Networks. We present compelling experiments demonstrating the advantage of our method. Our work opens an exciting line of research by providing a testbed and strong baselines. Our code is available at https://github.com/ebennequin/meta-domain-shift.
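To make the core idea concrete, the sketch below illustrates (not the paper's exact algorithm) how entropic OT can align a shifted query set with the support set before nearest-prototype classification, in the spirit of combining OT with Prototypical Networks. All data, the Sinkhorn implementation, and hyperparameters (`reg`, iteration count) are illustrative assumptions; it uses NumPy only.

```python
import numpy as np

def sinkhorn(a, b, C, reg=0.1, n_iter=200):
    """Entropic-regularized OT plan between histograms a, b for cost C."""
    K = np.exp(-C / reg)
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)
        u = a / (K @ v)
    return u[:, None] * K * v[None, :]

rng = np.random.default_rng(0)
# Toy 2-way 5-shot support set; queries are support points under a shift.
support = rng.normal(0.0, 1.0, (10, 5))
labels = np.repeat([0, 1], 5)
support[labels == 1] += 3.0
query = support[rng.integers(0, 10, 8)] + rng.normal(2.0, 0.1, (8, 5))

# Squared-Euclidean cost, normalized to keep the Gibbs kernel well-scaled.
C = ((query[:, None, :] - support[None, :, :]) ** 2).sum(-1)
C /= C.max()
a = np.full(len(query), 1.0 / len(query))    # uniform query weights
b = np.full(len(support), 1.0 / len(support))  # uniform support weights
P = sinkhorn(a, b, C)

# Barycentric mapping: transport each query onto the support distribution.
query_aligned = (P / P.sum(1, keepdims=True)) @ support

# Prototypes are class-mean support embeddings; classify aligned queries.
protos = np.stack([support[labels == c].mean(0) for c in (0, 1)])
pred = ((query_aligned[:, None, :] - protos[None]) ** 2).sum(-1).argmin(1)
```

The barycentric mapping is one common way to turn a transport plan into a point-wise correction; a full method would also learn the embedding network end-to-end.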