Training convolutional neural networks (CNNs) with back-propagation (BP) is time-consuming and resource-intensive, particularly because the dataset must be visited multiple times. In contrast, analytic learning attempts to obtain the weights in a single epoch. However, existing attempts at analytic learning have considered only the multilayer perceptron (MLP). In this article, we propose analytic convolutional neural network learning (ACnnL). Theoretically, we show that ACnnL admits a closed-form solution similar to that of its MLP counterpart, but with a different regularization constraint. Consequently, we can explain to a certain extent, from the perspective of implicit regularization, why CNNs usually generalize better than MLPs. We validate ACnnL on classification tasks over several benchmark datasets. Encouragingly, ACnnL trains CNNs significantly faster than BP while achieving prediction accuracies reasonably close to BP's. Moreover, our experiments reveal a unique advantage of ACnnL in small-sample scenarios, where training data are scarce or expensive.
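To illustrate the closed-form idea the abstract refers to, the sketch below solves a linear layer's weights in one shot via ridge regression, the generic regularized least-squares building block that analytic-learning methods for MLPs rely on. This is a minimal sketch of that shared building block under assumed shapes and a hypothetical regularization strength `lambda_reg`, not an implementation of ACnnL itself.

```python
import numpy as np

# Sketch: obtain a layer's weights analytically in one pass, instead of
# iterating with back-propagation. Shapes and lambda_reg are illustrative
# assumptions, not taken from the paper.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 16))    # 100 samples, 16 input features
W_true = rng.standard_normal((16, 4)) # ground-truth mapping to 4 outputs
Y = X @ W_true                        # targets (one-hot labels in practice)

lambda_reg = 1e-3
# Closed-form ridge solution: W = (X^T X + lambda * I)^{-1} X^T Y
W = np.linalg.solve(X.T @ X + lambda_reg * np.eye(16), X.T @ Y)

# With well-conditioned data and small lambda, W recovers W_true closely.
err = np.linalg.norm(W - W_true) / np.linalg.norm(W_true)
```

The regularization term `lambda_reg * np.eye(16)` is where methods differ: the abstract's claim is that the constraint implied by the convolutional structure differs from the MLP one, which is what connects the closed-form solution to the generalization discussion.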