The kernel interpolant in a reproducing kernel Hilbert space is optimal in the worst-case sense among all approximations of a function that use the same set of function values. In this paper we compare two search criteria for constructing lattice point sets for lattice-based kernel approximation. The first candidate, $\calP_n^*$, is based on the power function that appears in the machine learning literature. The second, $\calS_n^*$, is a criterion used to generate lattices for approximation by truncated Fourier series. We find that the empirical difference in error between the lattices constructed with $\calP_n^*$ and $\calS_n^*$ is marginal. The criterion $\calS_n^*$ is preferred because it is computationally cheaper and comes with a proven error bound.
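For context, the worst-case optimality and the power function mentioned above are standard notions in kernel interpolation; the following display is a minimal sketch of their usual definitions for a kernel $K$ and nodes $x_1,\dots,x_n$, in notation ($s_{f,n}$, $k_n$, $\mathbf{K}_n$, $P_n$, $\mathcal{H}_K$) introduced here for illustration rather than taken from this paper, and the precise form of the search criteria $\calP_n^*$ and $\calS_n^*$ is as defined in the paper itself.
% Sketch under assumed standard definitions (notation not from this paper):
% kernel interpolant, power function, and pointwise worst-case error bound.
\begin{align*}
  s_{f,n}(x) &= \sum_{j=1}^{n} c_j\, K(x, x_j),
  \qquad \mathbf{K}_n c = \bigl(f(x_1),\dots,f(x_n)\bigr)^\top,
  \qquad \mathbf{K}_n = \bigl(K(x_i,x_j)\bigr)_{i,j=1}^{n}, \\
  P_n(x)^2 &= K(x,x) - k_n(x)^\top \mathbf{K}_n^{-1} k_n(x),
  \qquad k_n(x) = \bigl(K(x,x_1),\dots,K(x,x_n)\bigr)^\top, \\
  \bigl|f(x) - s_{f,n}(x)\bigr| &\le P_n(x)\, \|f\|_{\mathcal{H}_K}
  \qquad \text{for all } f \in \mathcal{H}_K .
\end{align*}
A power-function-based criterion such as $\calP_n^*$ would then, presumably, select lattice generating vectors that make a suitable norm of $P_n$ small, whereas $\calS_n^*$ targets the error of approximation by a truncated Fourier series directly.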