The neural tangent kernel is a kernel function defined over the parameter distribution of an infinite-width neural network. Despite the impracticality of this limit, the neural tangent kernel has enabled a more direct study of neural networks, offering a glimpse through the veil of their black box. More recently, it has been shown theoretically that the Laplace kernel and the neural tangent kernel share the same reproducing kernel Hilbert space on the sphere $\mathbb{S}^{d-1}$, suggesting their equivalence. In this work, we analyze the practical equivalence of the two kernels. We do so first by matching the kernels exactly and then by matching the posteriors of a Gaussian process. Moreover, we analyze the kernels in $\mathbb{R}^d$ and experiment with them on regression tasks.
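As a minimal sketch of the two kernels being compared, the snippet below evaluates the standard closed-form NTK of a two-layer ReLU network (one common convention, without bias terms; normalizations differ across papers) alongside the Laplace kernel $\exp(-c\,\lVert x - y\rVert)$ on unit-norm inputs. The function names and the bandwidth value are illustrative, not taken from the paper.

```python
import numpy as np

def ntk_relu_2layer(u):
    # NTK of a two-layer ReLU network evaluated on unit-norm inputs,
    # written as a function of the inner product u = <x, y>.
    # Uses the arc-cosine kernel pieces kappa_0 and kappa_1.
    u = np.clip(u, -1.0, 1.0)  # guard against floating-point overshoot
    k0 = (np.pi - np.arccos(u)) / np.pi
    k1 = (u * (np.pi - np.arccos(u)) + np.sqrt(1.0 - u**2)) / np.pi
    return u * k0 + k1

def laplace_kernel(x, y, c=1.0):
    # Laplace (exponential) kernel; c is an illustrative bandwidth.
    return np.exp(-c * np.linalg.norm(x - y))

# Compare the two kernels on a random pair of points on the sphere.
rng = np.random.default_rng(0)
x, y = rng.standard_normal((2, 5))
x, y = x / np.linalg.norm(x), y / np.linalg.norm(y)
print(ntk_relu_2layer(x @ y), laplace_kernel(x, y))
```

Matching the kernels exactly then amounts to choosing the Laplace bandwidth (and an overall scale) so that the two curves agree as functions of the angle between inputs.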