In this short note, we reify the connection between work on the storage capacity problem in wide two-layer treelike neural networks and the rapidly-growing body of literature on kernel limits of wide neural networks. Concretely, we observe that the "effective order parameter" studied in the statistical mechanics literature is exactly equivalent to the infinite-width Neural Network Gaussian Process Kernel. This correspondence connects the expressivity and trainability of wide two-layer neural networks.
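As an illustrative aside (not part of the note itself), the claimed equivalence can be checked numerically. In the Gardner-style capacity analysis, the effective order parameter depends on a pair of inputs x, x' only through the Gaussian average E_w[phi(w.x) phi(w.x')] with w ~ N(0, I/n), which is exactly the single-hidden-layer Neural Network Gaussian Process (NNGP) kernel for activation phi. Below is a minimal NumPy sketch of this average, assuming inputs normalized so that ||x||^2 = n; the helper name nngp_kernel_mc and the choice phi = sign are illustrative, not taken from the note.

    import numpy as np

    def nngp_kernel_mc(x1, x2, phi=np.sign, n_samples=200_000, seed=0):
        # Monte Carlo estimate of K(x1, x2) = E_w[phi(w.x1) phi(w.x2)]
        # with w ~ N(0, I/n): the single-branch NNGP kernel / effective
        # order parameter discussed above.
        rng = np.random.default_rng(seed)
        n = x1.shape[0]
        w = rng.standard_normal((n_samples, n)) / np.sqrt(n)
        return float(np.mean(phi(w @ x1) * phi(w @ x2)))

    # Sanity check for phi = sign: with ||x||^2 = n, the pre-activations
    # are unit-variance Gaussians with correlation rho = x1.x2 / n, so the
    # kernel has the closed form (2/pi) * arcsin(rho).
    n = 50
    rng = np.random.default_rng(1)
    x1 = rng.standard_normal(n)
    x2 = rng.standard_normal(n)
    x1 *= np.sqrt(n) / np.linalg.norm(x1)
    x2 *= np.sqrt(n) / np.linalg.norm(x2)
    rho = float(x1 @ x2) / n
    print(nngp_kernel_mc(x1, x2))        # Monte Carlo estimate
    print(2.0 / np.pi * np.arcsin(rho))  # analytic kernel value

The two printed values should agree to within Monte Carlo error, illustrating that the input dependence of the effective order parameter is carried entirely by the NNGP kernel.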