Learning with noisy labels has attracted enormous interest in the area of robust deep learning. Recent studies have empirically shown that using dual networks can improve the performance of a single network, but without theoretical justification. In this paper, we propose the Cooperative Learning (CooL) framework for noisy supervision, which analytically explains the effect of leveraging dual or multiple networks. Specifically, the simple but efficient combination in CooL yields more reliable risk minimization on unseen clean data. A range of experiments has been conducted on several benchmarks under both synthetic and real-world noise settings. Extensive results indicate that CooL outperforms several state-of-the-art methods.
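The abstract does not specify CooL's combination rule. Purely as an illustration of the general dual-network idea, a minimal sketch of combining two networks' per-class probabilities by a convex mixture might look like the following (the function name, the averaging rule, and the weighting are assumptions, not the paper's actual method):

```python
import numpy as np

def combine_predictions(probs_a, probs_b, weight=0.5):
    """Illustrative combination of two networks' softmax outputs.

    A convex mixture of the two probability vectors, followed by an
    argmax over classes. This is a generic sketch, not CooL itself.
    """
    combined = weight * probs_a + (1.0 - weight) * probs_b
    return combined.argmax(axis=1)

# Hypothetical softmax outputs of two networks on three samples (3 classes).
p1 = np.array([[0.7, 0.2, 0.1],
               [0.1, 0.6, 0.3],
               [0.4, 0.4, 0.2]])
p2 = np.array([[0.6, 0.3, 0.1],
               [0.2, 0.2, 0.6],
               [0.1, 0.8, 0.1]])

print(combine_predictions(p1, p2))  # predicted class index per sample
```

On samples where the two networks disagree (e.g., the second and third rows above), the mixture lets the more confident network dominate, which is one intuition behind using dual networks under label noise.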