Meta-learning approaches to few-shot classification are computationally efficient at test time, requiring just a few optimization steps or a single forward pass to learn a new task, but they remain highly memory-intensive to train. This limitation arises because a task's entire support set, which can contain up to 1000 images, must be processed before an optimization step can be taken. Harnessing the performance gains offered by large images thus requires either parallelizing the meta-learner across multiple GPUs, which may not be available, or making trade-offs between task size and image size when memory is constrained. We improve on both options by proposing LITE, a general and memory-efficient episodic training scheme that enables meta-training on large tasks composed of large images on a single GPU. We achieve this by observing that the gradients for a task can be decomposed into a sum of gradients over the task's training images. This enables us to perform a forward pass on a task's entire training set but realize significant memory savings by back-propagating only a random subset of these images, which we show is an unbiased approximation of the full gradient. We use LITE to train meta-learners and demonstrate new state-of-the-art accuracy on the real-world ORBIT benchmark and on 3 of the 4 parts of the challenging VTAB+MD benchmark relative to leading meta-learners. LITE also enables meta-learners to be competitive with transfer learning approaches but at a fraction of the test-time computational cost, thus serving as a counterpoint to the recent narrative that transfer learning is all you need for few-shot classification.
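The key observation above is that a task's gradient decomposes into a sum of per-image gradients, so back-propagating a rescaled random subset yields an unbiased estimate of the full gradient. A minimal NumPy sketch of this estimator, using a simple linear least-squares model as a stand-in for a neural meta-learner (all names and sizes here are illustrative, not from the paper's implementation):

```python
import numpy as np

rng = np.random.default_rng(0)
N, D, H = 100, 5, 10          # N support images, D features, H back-propagated

X = rng.normal(size=(N, D))   # stand-in "support set" features
y = rng.normal(size=N)
w = rng.normal(size=D)        # model parameters

# Per-example squared-error gradients: g_i = 2 * (x_i . w - y_i) * x_i.
# The full task gradient is the sum of these per-example contributions.
residuals = X @ w - y
per_example = 2.0 * residuals[:, None] * X      # shape (N, D)
full_grad = per_example.sum(axis=0)

def lite_estimate(rng):
    """Back-propagate only H of N examples, rescaled by N/H so the
    expectation over random subsets equals the full gradient."""
    idx = rng.choice(N, size=H, replace=False)
    return (N / H) * per_example[idx].sum(axis=0)

# Averaging many subset estimates recovers the full gradient (unbiasedness).
est = np.mean([lite_estimate(rng) for _ in range(20000)], axis=0)
```

In an actual training loop only the selected H images would keep their computation graph (the rest are forwarded without gradient tracking), which is where the memory saving comes from; the variance of the estimate shrinks as H grows.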