Mutual knowledge distillation (MKD) improves a model by distilling knowledge from another model. However, \textit{not all knowledge is certain and correct}, especially under adverse conditions. For example, label noise usually leads to less reliable models due to undesired memorization \cite{zhang2017understanding,arpit2017closer}. Wrong knowledge misleads learning rather than helping it. This problem can be addressed from two aspects: (i) improving the reliability of the model that the knowledge comes from (i.e., the reliability of the knowledge source); (ii) selecting reliable knowledge for distillation. In the literature, making a model more reliable has been widely studied, while selective MKD has received little attention. Therefore, we focus on selective MKD. Concretely, we design a generic MKD framework, \underline{C}onfident knowledge selection followed by \underline{M}utual \underline{D}istillation (CMD). The key component of CMD is a generic knowledge selection formulation in which the selection threshold is either static (CMD-S) or progressive (CMD-P). Additionally, CMD covers two special cases, zero knowledge and all knowledge, leading to a unified MKD framework. Extensive experiments are presented to demonstrate the effectiveness of CMD and to thoroughly justify its design. For example, CMD-P obtains new state-of-the-art results in robustness against label noise.
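As a minimal sketch of the selection step (the notation below is illustrative and not taken from the paper), a sample is kept for distillation only when the peer model's confidence clears a threshold $\tau_t$, which is fixed for CMD-S and follows a schedule over training (a linear ramp is assumed here purely for illustration) for CMD-P:
\[
\mathcal{S}_t \;=\; \bigl\{\, x_i \;:\; \max_{c}\, p^{\text{peer}}_{c}(x_i) \;\ge\; \tau_t \,\bigr\},
\qquad
\tau_t \;=\;
\begin{cases}
\tau_0, & \text{CMD-S (static)},\\[2pt]
\tau_0 + (\tau_{T} - \tau_0)\,\tfrac{t}{T}, & \text{CMD-P (progressive, assumed linear)},
\end{cases}
\]
where only samples in $\mathcal{S}_t$ contribute to the mutual distillation loss at epoch $t$. The two special cases mentioned above fall out of the same formulation: $\tau_t > 1$ selects zero knowledge ($\mathcal{S}_t = \emptyset$), and $\tau_t = 0$ selects all knowledge.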