The presence of noisy labels in a dataset causes significant performance degradation for deep neural networks (DNNs). To address this problem, we propose a Meta Soft Label Generation algorithm, called MSLG, which jointly generates soft labels using meta-learning techniques and learns DNN parameters in an end-to-end fashion. Our approach adapts the meta-learning paradigm to estimate an optimal label distribution by checking gradient directions on both noisy training data and noise-free meta-data. To iteratively update the soft labels, a meta-gradient descent step is performed on the estimated labels so as to minimize the loss on the noise-free meta samples. In each iteration, the base classifier is then trained on the estimated labels. MSLG is model-agnostic and can easily be added on top of any existing model. We performed extensive experiments on the CIFAR10, Clothing1M and Food101N datasets; the results show that our approach outperforms other state-of-the-art methods by a large margin.
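The alternation described above (an inner classifier update on the current soft-label estimates, followed by a meta-gradient step on the label estimates against clean meta samples) can be illustrated with a minimal sketch. Everything here is an illustrative assumption, not the paper's implementation: the model is a tiny linear softmax classifier, the names `inner_step` and `mslg_label_update` are invented for this example, and the meta-gradient is approximated by finite differences rather than by backpropagating through the inner update as MSLG does.

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def cross_entropy(probs, targets):
    return -np.mean(np.sum(targets * np.log(probs + 1e-12), axis=-1))

def inner_step(W, X, soft_labels, lr=0.5):
    # One SGD step of a linear softmax classifier on the noisy set,
    # using the current soft-label estimates as training targets.
    P = softmax(X @ W)
    grad_W = X.T @ (P - soft_labels) / len(X)
    return W - lr * grad_W

def meta_loss(label_logits, W, X_noisy, X_meta, Y_meta):
    # Loss of the *updated* classifier on the clean (noise-free) meta samples.
    W_new = inner_step(W, X_noisy, softmax(label_logits))
    return cross_entropy(softmax(X_meta @ W_new), Y_meta)

def mslg_label_update(label_logits, W, X_noisy, X_meta, Y_meta,
                      meta_lr=5.0, eps=1e-4):
    # Meta-gradient descent on the label logits. MSLG backpropagates
    # through the inner step; here the gradient is approximated with
    # finite differences to keep the sketch dependency-free.
    base = meta_loss(label_logits, W, X_noisy, X_meta, Y_meta)
    grad = np.zeros_like(label_logits)
    it = np.nditer(label_logits, flags=["multi_index"])
    for _ in it:
        idx = it.multi_index
        pert = label_logits.copy()
        pert[idx] += eps
        grad[idx] = (meta_loss(pert, W, X_noisy, X_meta, Y_meta) - base) / eps
    return label_logits - meta_lr * grad

# Toy setup: 2 samples, 2 classes; sample 0 is mislabeled in the noisy set.
X = np.array([[1.0, 0.0], [0.0, 1.0]])
Y_noisy = np.array([[0.0, 1.0], [0.0, 1.0]])  # sample 0's true class is 0
Y_meta = np.array([[1.0, 0.0], [0.0, 1.0]])   # small clean meta set

W = np.zeros((2, 2))
label_logits = 2.0 * Y_noisy                  # soft labels start at noisy labels

p_init = softmax(label_logits)[0, 0]          # belief that sample 0 is class 0
for _ in range(20):
    label_logits = mslg_label_update(label_logits, W, X, X, Y_meta)
    W = inner_step(W, X, softmax(label_logits))
p_final = softmax(label_logits)[0, 0]
```

On this toy problem the estimated soft label of the mislabeled sample drifts back toward its true class (`p_final > p_init`), since assigning it more class-0 mass reduces the loss of the updated classifier on the clean meta samples.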