The family of Mat\'ern kernels is often used in spatial statistics, function approximation and Gaussian process methods in machine learning. One reason for their popularity is the presence of a smoothness parameter that controls, for example, optimal error bounds for kriging and posterior contraction rates in Gaussian process regression. On closed Riemannian manifolds, we show that the smoothness parameter can be consistently estimated from the maximizer(s) of the Gaussian likelihood when the underlying data are from point evaluations of a Gaussian process and, perhaps surprisingly, even when the data comprise evaluations of a non-Gaussian process. The points at which the process is observed need not have any particular spatial structure beyond quasi-uniformity. Our methods are based on results from approximation theory for the Sobolev scale of Hilbert spaces. Moreover, we generalize a well-known equivalence of measures phenomenon related to Mat\'ern kernels to the non-Gaussian case by using Kakutani's theorem.
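To make the estimation claim concrete, the following is a minimal one-dimensional Euclidean sketch (not the closed-manifold setting of the paper): a Gaussian process with Mat\'ern kernel of known lengthscale and variance is simulated at quasi-uniform points, and the smoothness parameter is recovered by maximizing the Gaussian likelihood. The helper names (matern_cov, neg_log_lik) and the fixed hyperparameters are illustrative assumptions, not the authors' implementation.

# Minimal illustrative sketch: maximum-likelihood estimation of the Matern
# smoothness parameter nu from point evaluations of a Gaussian process.
# One-dimensional Euclidean toy example; lengthscale and variance are held
# fixed at their true values for simplicity (an assumption of this sketch).
import numpy as np
from scipy.special import gamma, kv
from scipy.optimize import minimize_scalar

def matern_cov(x, nu, ell=0.3, sigma2=1.0):
    """Matern covariance matrix for 1-D inputs x with smoothness nu."""
    r = np.abs(x[:, None] - x[None, :])
    scaled = np.sqrt(2.0 * nu) * r / ell
    np.fill_diagonal(scaled, 1.0)  # placeholder; diagonal is overwritten below
    K = sigma2 * (2.0 ** (1.0 - nu) / gamma(nu)) * scaled ** nu * kv(nu, scaled)
    np.fill_diagonal(K, sigma2)    # limit of the Matern form at distance zero
    return K

def neg_log_lik(nu, x, y, jitter=1e-6):
    """Negative Gaussian log-likelihood of y under the Matern-nu model,
    dropping the constant (n/2) log(2*pi), which does not affect the maximizer."""
    K = matern_cov(x, nu) + jitter * np.eye(len(x))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L, y)
    return 0.5 * alpha @ alpha + np.log(np.diag(L)).sum()

rng = np.random.default_rng(0)
n = 200
x = np.linspace(0.0, 1.0, n)                  # equispaced, hence quasi-uniform
nu_true = 1.5
L = np.linalg.cholesky(matern_cov(x, nu_true) + 1e-6 * np.eye(n))
y = L @ rng.standard_normal(n)                # GP sample with smoothness nu_true

res = minimize_scalar(neg_log_lik, bounds=(0.2, 5.0), args=(x, y),
                      method="bounded")
print(f"true nu = {nu_true:.2f}, ML estimate = {res.x:.2f}")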