In deep learning, auxiliary training is widely used to assist the training of models. During the training phase, attaching auxiliary modules can improve the performance of the model; during the testing phase, the auxiliary modules are removed, so no test-time parameters are added. In this paper, we propose a novel auxiliary training method, Siamese Labels Auxiliary Learning (SiLa). Unlike Deep Mutual Learning (DML), SiLa emphasizes auxiliary learning and can be easily combined with DML. In summary, the main contributions of this paper are: (1) we propose SiLa learning, which improves the performance of common models without increasing test parameters; (2) we compare SiLa with DML and show that SiLa improves the generalization of the model; (3) we apply SiLa to Dynamic Neural Networks and show that SiLa can be used with various types of network structures.
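To make the train-time/test-time distinction concrete, below is a minimal sketch of generic auxiliary training, not the paper's SiLa method itself: a shared backbone carries both a main head and an auxiliary head, the auxiliary head contributes a loss only during training, and it is ignored at inference so deployed parameters are unchanged. All names here (AuxNet, aux_head) are illustrative assumptions in a PyTorch-style setup.

```python
import torch
import torch.nn as nn

class AuxNet(nn.Module):
    """Hypothetical backbone with a removable auxiliary classification head."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.main_head = nn.Linear(32, num_classes)
        # Auxiliary head: trained jointly, but unused at test time,
        # so it adds no parameters to the deployed model.
        self.aux_head = nn.Linear(32, num_classes)

    def forward(self, x):
        feats = self.backbone(x)
        if self.training:
            return self.main_head(feats), self.aux_head(feats)
        return self.main_head(feats)

# Training step: both heads contribute to the loss.
model = AuxNet()
criterion = nn.CrossEntropyLoss()
x, y = torch.randn(8, 3, 32, 32), torch.randint(0, 10, (8,))
main_logits, aux_logits = model(x)
loss = criterion(main_logits, y) + criterion(aux_logits, y)
loss.backward()

# Test step: only the main head runs, so inference cost is unchanged.
model.eval()
with torch.no_grad():
    preds = model(x).argmax(dim=1)
```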