The high cost of acquiring and annotating samples has made the `few-shot' learning problem of prime importance. Existing works mainly focus on improving performance on clean data and overlook robustness to data perturbed with adversarial noise. Recently, a few efforts have been made to combine the few-shot problem with the robustness objective using sophisticated meta-learning techniques. These methods rely on the generation of adversarial samples in every episode of training, which adds a further computational burden. To avoid such time-consuming and complicated procedures, we propose a simple but effective alternative that does not require any adversarial samples. Inspired by the cognitive decision-making process in humans, we enforce high-level feature matching between the base-class data and their corresponding low-frequency samples in the pretraining stage via self-distillation. The model is then fine-tuned on the samples of novel classes, where we additionally improve the discriminability of low-frequency query-set features via cosine similarity. On the 1-shot setting of the CIFAR-FS dataset, our method yields large improvements of $60.55\%$ and $62.05\%$ in adversarial accuracy against PGD and the state-of-the-art AutoAttack, respectively, with a minor drop in clean accuracy compared to the baseline. Moreover, our method requires only $1.69\times$ the standard training time while being $\approx 5\times$ faster than state-of-the-art adversarial meta-learning methods. The code is available at https://github.com/vcl-iisc/robust-few-shot-learning.
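To make the pretraining objective concrete, the following is a minimal PyTorch sketch. It assumes the low-frequency view of an image is obtained with a Fourier low-pass filter and that agreement between the high-level features of the clean and low-frequency views is measured with cosine similarity; the `cutoff` value, the `encoder` interface, and the use of cosine distance for the distillation term are illustrative assumptions, not details fixed by the abstract.

```python
import torch
import torch.nn.functional as F

def low_frequency(images, cutoff=0.25):
    # Low-pass filter in the Fourier domain. `cutoff` (the fraction of the
    # spectral radius to keep) is an assumed hyperparameter.
    _, _, H, W = images.shape
    spec = torch.fft.fftshift(torch.fft.fft2(images), dim=(-2, -1))
    yy = torch.arange(H, device=images.device).float() - H / 2
    xx = torch.arange(W, device=images.device).float() - W / 2
    radius = (yy[:, None] ** 2 + xx[None, :] ** 2).sqrt()
    mask = (radius <= cutoff * min(H, W) / 2).to(spec.dtype)
    filtered = torch.fft.ifft2(torch.fft.ifftshift(spec * mask, dim=(-2, -1)))
    return filtered.real

def self_distillation_loss(encoder, images):
    # Encourage the encoder's high-level features for a clean image and its
    # low-frequency counterpart to agree (feature matching via
    # self-distillation). `encoder` is assumed to map a batch of images to
    # penultimate-layer feature vectors.
    feats_clean = encoder(images)
    feats_low = encoder(low_frequency(images))
    return (1.0 - F.cosine_similarity(feats_clean, feats_low, dim=-1)).mean()
```

In this sketch the distillation term is added to the standard cross-entropy loss during base-class pretraining; no adversarial examples are generated at any point, which is what keeps the training cost close to that of standard training.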