Computing low-rank approximations of kernel matrices is an important problem with many applications in scientific computing and data science. We propose methods to efficiently compute and store low-rank approximations of kernel matrices that depend on certain hyperparameters. The main idea behind our method is to use multivariate Chebyshev function approximation along with a tensor train decomposition of the coefficient tensor. The computation proceeds in two stages: an offline stage, which dominates the computational cost and is parameter-independent, and an online stage, which is inexpensive and instantiated for specific hyperparameters. A variation of this method addresses the case in which the kernel matrix is symmetric and positive semi-definite. The resulting algorithms have complexity that is linear in the sizes of the kernel matrices. We investigate the efficiency and accuracy of our methods through numerical experiments on parametric kernel matrices induced by several kernels, such as the Mat\'ern kernel. Our methods achieve speedups of up to $200\times$ in the online time compared to other methods of similar complexity and comparable accuracy.
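To make the construction concrete, the following is a minimal sketch of the two-stage idea; the notation (the variables $\xi_1,\dots,\xi_d$, the degrees $n_k$, and the TT ranks $r_k$) is assumed for illustration and may differ from the paper's exact formulation. The parametric kernel $\kappa(x,y;\theta)$ is expanded in a multivariate Chebyshev basis, and the resulting coefficient tensor is compressed in tensor train format:
$$\kappa(x,y;\theta) \approx \sum_{j_1=0}^{n_1}\cdots\sum_{j_d=0}^{n_d} c_{j_1\cdots j_d}\, T_{j_1}(\xi_1)\cdots T_{j_d}(\xi_d), \qquad c_{j_1\cdots j_d} \approx G_1(j_1)\,G_2(j_2)\cdots G_d(j_d),$$
where $\xi_1,\dots,\xi_d$ collect the relevant components of $x$, $y$, and the hyperparameters $\theta$ mapped to $[-1,1]$, $T_j$ is the degree-$j$ Chebyshev polynomial, and the TT cores satisfy $G_k(j_k)\in\mathbb{R}^{r_{k-1}\times r_k}$ with $r_0=r_d=1$. Under this sketch, the offline stage computes the Chebyshev coefficients and their TT compression once, independently of the hyperparameters; the online stage, given specific hyperparameter values, only evaluates the corresponding Chebyshev polynomials and contracts the associated cores, which is why it is inexpensive.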