The study of infinite-width neural networks is important for a better understanding of neural networks in practical applications. In this work, we derive the equivalence between deep, infinite-width maxout networks and Gaussian processes (GPs), and characterize the maxout kernel with a compositional structure. Moreover, we build a connection between our deep maxout network kernel and deep neural network kernels. We also give an efficient numerical implementation of our kernel that can be adapted to any maxout rank. Numerical results show that Bayesian inference based on the deep maxout network kernel can achieve competitive results compared with its finite-width counterparts and with deep neural network kernels. This suggests that the maxout activation may also be incorporated into other infinite-width neural network structures, such as convolutional neural networks (CNNs).
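To illustrate the compositional structure mentioned above, the following is a minimal Monte Carlo sketch, not the paper's actual implementation: at each layer, the pre-activations for a pair of inputs are jointly Gaussian with covariance given by the previous layer's kernel, each of the `rank` linear pieces is independent, and the next layer's kernel entry is the (scaled) expectation of the product of the two maxima. All function and parameter names here (`maxout_kernel_mc`, `sigma_w`, `sigma_b`, `n_samples`) are illustrative assumptions, and the expectation is estimated by sampling rather than computed in closed form.

```python
import numpy as np

def maxout_kernel_mc(x1, x2, depth=3, rank=2, sigma_w=1.0, sigma_b=0.1,
                     n_samples=100_000, seed=0):
    """Monte Carlo sketch of a compositional maxout NNGP kernel.

    For each layer, draw `rank` i.i.d. jointly Gaussian pre-activation
    pairs (u, u') with covariance from the previous layer's kernel, and
    set the next kernel entry to sigma_w^2 * E[max_i u_i * max_i u'_i]
    plus the bias variance. This is a hypothetical sketch, not the
    paper's exact recursion or its efficient implementation.
    """
    rng = np.random.default_rng(seed)
    d = len(x1)
    # Base case: linear (inner-product) kernel on the raw inputs.
    k11 = sigma_w**2 * np.dot(x1, x1) / d + sigma_b**2
    k22 = sigma_w**2 * np.dot(x2, x2) / d + sigma_b**2
    k12 = sigma_w**2 * np.dot(x1, x2) / d + sigma_b**2
    for _ in range(depth):
        cov = np.array([[k11, k12], [k12, k22]])
        # Shape (n_samples, rank, 2): independent pieces, correlated inputs.
        z = rng.multivariate_normal(np.zeros(2), cov, size=(n_samples, rank))
        m1 = z[:, :, 0].max(axis=1)  # maxout unit applied to input 1
        m2 = z[:, :, 1].max(axis=1)  # maxout unit applied to input 2
        k11, k22, k12 = (
            sigma_w**2 * np.mean(m1 * m1) + sigma_b**2,
            sigma_w**2 * np.mean(m2 * m2) + sigma_b**2,
            sigma_w**2 * np.mean(m1 * m2) + sigma_b**2,
        )
    return k12

# Kernel value between two toy inputs under this sketch.
x1 = np.array([1.0, 0.0, -1.0])
x2 = np.array([0.5, 1.0, 0.0])
print(maxout_kernel_mc(x1, x2))
```

The same loop adapts to any maxout rank by changing the `rank` argument; a practical implementation would replace the sampling step with an analytic or quadrature-based evaluation of the expectation.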