Pre-trained models are widely used for fine-tuning on downstream tasks with linear classifiers optimized by the cross-entropy loss, which may suffer from robustness and stability problems. These problems can be mitigated by learning representations that focus on similarities within the same class and contrasts across different classes when making predictions. In this paper, we utilize a K-Nearest Neighbors (KNN) classifier in pre-trained model fine-tuning. For this KNN classifier, we introduce a supervised momentum contrastive learning framework to learn clustered representations for supervised downstream tasks. Extensive experiments on text classification tasks and robustness tests show that, by incorporating KNN into the traditional fine-tuning process, we obtain significant improvements in clean accuracy in both rich-resource and few-shot settings, as well as improved robustness against adversarial attacks.\footnote{All code is available at https://github.com/LinyangLee/KNN-BERT}
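As a rough, hypothetical illustration of the two ingredients mentioned above (clustered representations trained with a supervised contrastive objective, and KNN prediction over stored training representations), the sketch below shows one way these pieces could be wired together in PyTorch. The function names, the temperature, and the value of k are illustrative assumptions, not the configuration used in the released code.

\begin{verbatim}
# Minimal illustrative sketch (not the released KNN-BERT code) of
# (1) a supervised contrastive loss over a batch of encoder representations
# and (2) a KNN vote over stored training features.
# The temperature and k values are hypothetical placeholders.
import torch
import torch.nn.functional as F

def supervised_contrastive_loss(features, labels, temperature=0.1):
    """Pull same-label representations together, push different labels apart."""
    features = F.normalize(features, dim=-1)
    sim = features @ features.t() / temperature            # pairwise similarity
    self_mask = torch.eye(len(labels), dtype=torch.bool)
    pos_mask = labels.unsqueeze(0).eq(labels.unsqueeze(1)) & ~self_mask
    logits = sim.masked_fill(self_mask, -1e9)              # exclude self-pairs
    log_prob = logits - torch.logsumexp(logits, dim=1, keepdim=True)
    pos_counts = pos_mask.sum(dim=1).clamp(min=1)
    return -(log_prob * pos_mask.float()).sum(dim=1).div(pos_counts).mean()

def knn_predict(queries, memory_feats, memory_labels, k=8, num_classes=2):
    """Classify each query by a majority vote among its k nearest neighbors."""
    queries = F.normalize(queries, dim=-1)
    memory_feats = F.normalize(memory_feats, dim=-1)
    sim = queries @ memory_feats.t()
    _, idx = sim.topk(k, dim=1)                            # nearest stored examples
    votes = F.one_hot(memory_labels[idx], num_classes)     # (batch, k, classes)
    return votes.sum(dim=1).argmax(dim=1)
\end{verbatim}

In this sketch, the contrastive loss would be added to the fine-tuning objective so that representations cluster by label, and at inference time the KNN vote replaces (or complements) the linear classifier's prediction; how the two are combined in practice follows the paper and released code rather than this illustration.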