In this paper, we study the problem of cross-domain few-shot classification, which aims to learn a classifier for previously unseen classes and domains from few labeled samples. We analyze the performance and efficiency of several adaptation strategies, including various adapter topologies and operations, which can be easily attached to existing methods trained with different meta-training strategies and adapt them to a given task during the meta-test phase. We show that parametric adapters attached to convolutional layers with residual connections perform best and significantly improve the performance of state-of-the-art models on the Meta-Dataset benchmark at a minor additional cost. Our code will be available at https://github.com/VICO-UoE/URL.
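To make the adapter design concrete, the following is a minimal sketch of a parametric adapter attached to a convolutional layer with a residual connection; the class name, zero initialization, and the choice of applying the 1x1 adapter to the layer's output are illustrative assumptions, not the paper's exact implementation.

```python
import torch
import torch.nn as nn


class ResidualConvAdapter(nn.Module):
    """Wraps a (frozen) pre-trained convolution with a lightweight 1x1
    parametric adapter connected through a residual (skip) connection.

    Only the adapter weights are updated when adapting to a new task
    at meta-test time; the base convolution stays fixed.
    """

    def __init__(self, conv: nn.Conv2d):
        super().__init__()
        self.conv = conv  # pre-trained layer, kept frozen during adaptation
        # 1x1 convolution acting on the base layer's output channels.
        self.adapter = nn.Conv2d(
            conv.out_channels, conv.out_channels, kernel_size=1, bias=False
        )
        # Zero-init so the wrapped layer initially behaves exactly like the base model.
        nn.init.zeros_(self.adapter.weight)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.conv(x)
        return h + self.adapter(h)  # residual connection around the adapter


if __name__ == "__main__":
    base = nn.Conv2d(64, 128, kernel_size=3, padding=1)
    adapted = ResidualConvAdapter(base)
    out = adapted(torch.randn(4, 64, 32, 32))
    print(out.shape)  # torch.Size([4, 128, 32, 32])
```

Because the adapter adds only a small number of task-specific parameters per convolutional layer, adapting to a new few-shot task incurs a minor additional cost relative to fine-tuning the full backbone.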