Given a zero-mean Gaussian random field with a covariance function that belongs to a parametric family of covariance functions, we introduce a new notion of likelihood approximations, termed truncated-likelihood functions. Truncated-likelihood functions are based on direct functional approximations of the presumed family of covariance functions. For compactly supported covariance functions, within an increasing-domain asymptotic framework, we provide sufficient conditions under which consistency and asymptotic normality of estimators based on truncated-likelihood functions are preserved. We apply our results to the family of generalized Wendland covariance functions and discuss several examples of Wendland approximations. For families of covariance functions that are not compactly supported, we combine our results with the covariance tapering approach and show that maximum likelihood (ML) estimators based on truncated-tapered likelihood functions asymptotically minimize the Kullback-Leibler divergence when the taper range is fixed.
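As a concrete illustration (not taken from the paper itself), the two ingredients named in the abstract can be sketched in a few lines: a compactly supported Wendland covariance and the covariance tapering construction, in which a non-compactly-supported model (here, the exponential covariance, chosen for illustration) is multiplied by a Wendland taper with a fixed range. The function names and parameter values are assumptions for the sketch; the paper's generalized Wendland family is broader, and this uses only the classic Wendland kernel that is positive definite in up to three dimensions.

```python
import numpy as np

def wendland_cov(r, beta=1.0, sigma2=1.0):
    """Classic Wendland covariance sigma2 * (1 - t)^4 * (4t + 1), t = r / beta.

    Compactly supported: identically zero for distances r >= beta.
    """
    t = np.minimum(np.abs(np.asarray(r, dtype=float)) / beta, 1.0)
    return sigma2 * (1.0 - t) ** 4 * (4.0 * t + 1.0)

def exponential_cov(r, phi=1.0, sigma2=1.0):
    """Exponential covariance: strictly positive at every distance (no compact support)."""
    return sigma2 * np.exp(-np.abs(np.asarray(r, dtype=float)) / phi)

def tapered_cov(r, taper_range=1.0):
    """Covariance tapering: the product of a non-compactly-supported model and a
    compactly supported taper is exactly zero beyond the (fixed) taper range,
    which yields sparse covariance matrices for likelihood computations."""
    return exponential_cov(r) * wendland_cov(r, beta=taper_range)
```

For example, `tapered_cov(2.0, taper_range=1.0)` is exactly `0.0`, while `exponential_cov(2.0)` is still positive; it is this induced sparsity that tapering exploits when the taper range is held fixed.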