A deep neural network trained on noisy labels is known to quickly lose its ability to discriminate clean instances from noisy ones. Once the early-learning phase ends, the network memorizes the noisy instances, which degrades its generalization performance. To resolve this issue, we propose MARVEL (MARgins Via Early Learning), which tracks how well each instance is fit by maintaining an epoch history of its classification margins. Instances that accumulate consecutive negative margins are flagged as noisy and discarded by zeroing out their sample weights. In addition, MARVEL+ upweights arduous instances, enabling the network to learn a more nuanced representation of the classification boundary. Experimental results on benchmark datasets with synthetic label noise show that MARVEL consistently outperforms other baselines across different noise levels, with a significantly larger margin under asymmetric noise.
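To make the margin-tracking idea concrete, the sketch below shows one plausible way to maintain a per-instance history of classification margins and to zero out the weights of instances whose margin has stayed negative for several consecutive epochs. This is only an illustration of the mechanism described above, not the paper's exact procedure: the margin definition, the `patience` threshold, and the names `MarginTracker`, `classification_margins`, and `sample_weights` are assumptions introduced here, and the MARVEL+ upweighting of hard instances is omitted.

```python
import numpy as np

def classification_margins(logits, labels):
    """Margin of the labeled class over the best competing class, per instance.

    A negative margin means the network currently misclassifies the instance
    under its given (possibly noisy) label.
    """
    n = logits.shape[0]
    true_logit = logits[np.arange(n), labels]
    competing = logits.copy()
    competing[np.arange(n), labels] = -np.inf  # mask out the labeled class
    return true_logit - competing.max(axis=1)

class MarginTracker:
    """Keeps a running count of consecutive negative-margin epochs per instance
    and zeroes the sample weight once that count reaches `patience`."""

    def __init__(self, num_samples, patience=5):
        self.neg_streak = np.zeros(num_samples, dtype=int)
        self.patience = patience  # assumed hyperparameter, not from the paper

    def update(self, logits, labels, indices):
        # Called once per epoch with the logits for the instances at `indices`.
        m = classification_margins(logits, labels)
        self.neg_streak[indices] = np.where(m < 0, self.neg_streak[indices] + 1, 0)

    def sample_weights(self, indices):
        # Suspected noisy instances (long negative-margin streaks) get weight 0.
        return (self.neg_streak[indices] < self.patience).astype(float)
```

In a training loop, `update` would be called after each epoch's forward pass, and the returned `sample_weights` would multiply the per-instance losses, so that instances with a sustained negative margin stop contributing to the gradient.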