Auxiliary information has attracted increasing attention in machine learning. Most attempts so far to incorporate such auxiliary information into state-of-the-art learning pipelines simply append the auxiliary features at the data level or the feature level. In this paper, we propose a novel training method with new components and architectures: Siamese labels, which serve as an auxiliary module during the training phase and are removed in the testing phase. The Siamese label module makes the network easier to train and improves its performance at test time. The main contributions can be summarized as follows: 1) Siamese labels are proposed, for the first time, as auxiliary information to improve learning efficiency; 2) we design a new architecture, the Siamese Labels Auxiliary Network (SilaNet), to assist the training of the model; 3) SilaNet is applied to compress the model parameters by 50% while maintaining high accuracy. For comparison, we evaluate the network on CIFAR-10 and CIFAR-100 with several common models. The proposed SilaNet achieves excellent accuracy and robustness.
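The train/test asymmetry described above (an auxiliary head supervised during training, then discarded at inference) can be sketched as follows. This is a minimal conceptual illustration, not the authors' implementation: the class, method names, and toy arithmetic are all hypothetical stand-ins for a real backbone and heads.

```python
# Conceptual sketch (hypothetical, not the paper's code): a network whose
# auxiliary "siamese label" head is active only during training and is
# removed entirely at test time, leaving only the main head.

class SilaNetSketch:
    def __init__(self):
        self.training = True  # the auxiliary head is used only in this mode

    def backbone(self, x):
        # Stand-in for the shared feature extractor.
        return [v * 0.5 for v in x]

    def main_head(self, feats):
        # Primary classification output; this is all that remains at test time.
        return sum(feats)

    def siamese_label_head(self, feats):
        # Auxiliary output trained against the siamese label;
        # discarded once training is finished.
        return max(feats)

    def forward(self, x):
        feats = self.backbone(x)
        out = self.main_head(feats)
        if self.training:
            # Auxiliary supervision contributes only during training.
            return out, self.siamese_label_head(feats)
        return out

model = SilaNetSketch()
train_out = model.forward([2.0, 4.0])  # (main output, auxiliary output)
model.training = False                 # "remove" the auxiliary module
test_out = model.forward([2.0, 4.0])   # main output only
```

The key design point this mirrors is that the auxiliary branch adds training signal but no inference cost: once `training` is off, the test-time network is exactly the backbone plus the main head.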