We propose a sparse arithmetic for kernel matrices, enabling efficient scattered data analysis. The compression of kernel matrices by means of samplets yields sparse matrices such that assembly, addition, and multiplication of these matrices can be performed with essentially linear cost. Since the inverse of a kernel matrix is compressible as well, we also have fast access to the inverse kernel matrix by employing exact sparse selected inversion techniques. As a consequence, we can rapidly evaluate series expansions and contour integrals to access, numerically and approximately in a data-sparse format, more complicated matrix functions such as $A^\alpha$ and $\exp(A)$. By exploiting the matrix arithmetic, efficient Gaussian process learning algorithms for spatial statistics can also be realized. Numerical results are presented to quantify and qualify our findings.
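To illustrate the underlying idea that sparse addition and multiplication suffice to evaluate a matrix function via a truncated series, the following minimal sketch approximates $\exp(A)$ for a sparse matrix with `scipy.sparse`. This is only an assumption-laden illustration: it does not use samplet compression, selected inversion, or the contour integral quadratures of the paper, and the function name `sparse_expm_series`, the series length, and the thresholding tolerance are hypothetical choices made for this example.

```python
# Illustrative sketch: truncated Taylor series for exp(A) using only sparse
# matrix addition and multiplication, with a thresholding step that mimics
# keeping intermediate results in a data-sparse format. Parameters are
# arbitrary example values, not those used in the paper.
import numpy as np
import scipy.sparse as sp

def sparse_expm_series(A, terms=10, threshold=1e-8):
    """Approximate exp(A) by the first `terms` Taylor terms for sparse A."""
    n = A.shape[0]
    result = sp.identity(n, format="csr")
    term = sp.identity(n, format="csr")
    for k in range(1, terms + 1):
        term = (term @ A) / k            # next Taylor term A^k / k!
        term.data[np.abs(term.data) < threshold] = 0.0
        term.eliminate_zeros()           # re-sparsify after the product
        result = result + term
    return result

# Example usage on a small, well-scaled sparse test matrix
A = sp.random(200, 200, density=0.02, format="csr", random_state=0)
A = 0.05 * (A + A.T)                     # symmetrize and scale for fast convergence
E = sparse_expm_series(A)
print(E.nnz, "nonzeros in the approximate matrix exponential")
```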