Knowledge transfer using convolutional neural networks (CNNs) can help efficiently train a CNN with fewer parameters or maximize generalization performance under limited supervision. To enable a more efficient transfer of pretrained knowledge under relaxed conditions, we propose a simple yet powerful knowledge transfer methodology without any restrictions on the network structure or dataset used, namely self-supervised knowledge transfer (SSKT) via loosely supervised auxiliary tasks. To this end, we devise a training method that transfers previously learned knowledge into the current training process as an auxiliary task for the target task, through self-supervision with soft labels. Because SSKT is independent of the network structure and dataset, and is trained differently from existing knowledge transfer methods, prior knowledge acquired from diverse tasks can be transferred naturally to the target task during training. Furthermore, SSKT improves generalization performance on most datasets by transferring knowledge across different problem domains from multiple source networks. In experiments under various knowledge transfer settings, SSKT outperforms other transfer learning methods, including knowledge distillation (KD), deep mutual learning (DML), and meta auxiliary learning (MAXL). The source code will be made publicly available.
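To make the described training recipe concrete, below is a minimal PyTorch sketch of one training step under the stated idea: the target network is trained on its own task while an auxiliary head is supervised by soft labels from a frozen, pretrained source network. The names (`target_net`, `aux_head`, `source_net`), the assumption that the target network returns both backbone features and task logits, and the weight `alpha` and `temperature` values are illustrative assumptions, not the authors' exact implementation.

```python
import torch
import torch.nn.functional as F

def sskt_step(target_net, aux_head, source_net, x, y,
              optimizer, alpha=0.5, temperature=4.0):
    """One training step: target-task loss plus an auxiliary loss
    against soft labels produced by a frozen source network."""
    source_net.eval()

    # Soft labels from the pretrained source network (no gradients).
    with torch.no_grad():
        soft_labels = F.softmax(source_net(x) / temperature, dim=1)

    # Shared backbone features feed both the target-task head and the
    # auxiliary head that mimics the source task's predictions.
    features, target_logits = target_net(x)  # assumed to return both
    aux_logits = aux_head(features)

    target_loss = F.cross_entropy(target_logits, y)
    aux_loss = F.kl_div(F.log_softmax(aux_logits / temperature, dim=1),
                        soft_labels, reduction="batchmean")

    loss = target_loss + alpha * aux_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Note that, unlike standard knowledge distillation, the auxiliary head here is separate from the target-task head, so source and target tasks need not share a label space or dataset; this is one plausible reading of the "no restrictions on the network structure or dataset" claim.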