Deep neural networks have long training and inference times. Early exits added to a neural network allow it to make predictions from intermediate activations, which is valuable in time-sensitive applications; however, early exits increase the network's training time. We introduce QuickNets: a novel cascaded training algorithm for faster training of neural networks. QuickNets are trained layer-wise, so that each successive layer is trained only on samples that the previous layers could not classify correctly. We demonstrate that QuickNets dynamically distribute learning and reduce both training and inference cost compared to standard backpropagation. Additionally, we introduce commitment layers that significantly improve the early exits by identifying over-confident predictions, and we demonstrate their success.
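The cascaded idea described above can be illustrated with a minimal sketch. This is not the paper's implementation: simple logistic-regression classifiers stand in for network layers with early exits, and a fixed confidence threshold (an assumption, not a described commitment layer) decides when to exit early at inference.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary dataset: two Gaussian blobs.
X = np.vstack([rng.normal(-1, 1.0, (200, 2)), rng.normal(1, 1.0, (200, 2))])
y = np.concatenate([np.zeros(200, dtype=int), np.ones(200, dtype=int)])

def train_exit(Xs, ys, epochs=200, lr=0.1):
    """Logistic-regression 'exit' (a stand-in for one layer + classifier)."""
    w = np.zeros(Xs.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(Xs @ w + b)))
        g = p - ys                       # gradient of the logistic loss
        w -= lr * (Xs.T @ g) / len(ys)
        b -= lr * g.mean()
    return w, b

def predict_conf(Xs, w, b):
    p = 1.0 / (1.0 + np.exp(-(Xs @ w + b)))
    return (p > 0.5).astype(int), np.maximum(p, 1 - p)

# Cascaded training: each stage sees only the samples its predecessors
# failed to classify correctly.
stages, Xr, yr = [], X, y
for depth in range(3):
    if len(yr) == 0:
        break
    w, b = train_exit(Xr, yr)
    stages.append((w, b))
    pred, _ = predict_conf(Xr, w, b)
    wrong = pred != yr
    Xr, yr = Xr[wrong], yr[wrong]        # forward only the failures

# Inference: exit at the first stage whose confidence clears the threshold,
# otherwise fall through to the final stage.
def quicknet_predict(x, threshold=0.9):
    pred = None
    for w, b in stages:
        pred, conf = predict_conf(x[None, :], w, b)
        if conf[0] >= threshold:
            return pred[0]
    return pred[0]

preds = np.array([quicknet_predict(x) for x in X])
print("accuracy:", (preds == y).mean())
```

Because easy samples exit at the first stage, later stages train on progressively smaller subsets, which is the source of the training-cost reduction claimed for the cascaded scheme.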