We consider linear approximation based on function evaluations in reproducing kernel Hilbert spaces of certain analytic weighted power series kernels and stationary kernels on the interval $[-1,1]$. Both classes contain the popular Gaussian kernel $K(x, y) = \exp(-\tfrac{1}{2}\varepsilon^2(x-y)^2)$. For weighted power series kernels we derive almost matching upper and lower bounds on the worst-case error. When applied to the Gaussian kernel, our results state that, up to a sub-exponential factor, the $n$th minimal error decays as $(\varepsilon/n)^n (n!)^{-1/2}$. The proofs are based on weighted polynomial interpolation and classical polynomial coefficient estimates.
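The stated decay rate can be evaluated numerically. The sketch below is an illustration, not part of the paper: it defines the Gaussian kernel from the abstract and the leading-order rate $(\varepsilon/n)^n (n!)^{-1/2}$, omitting the sub-exponential factor; the function names are our own.

```python
import math

def gaussian_kernel(x, y, eps):
    # Gaussian kernel K(x, y) = exp(-0.5 * eps^2 * (x - y)^2) from the abstract
    return math.exp(-0.5 * eps**2 * (x - y) ** 2)

def minimal_error_rate(n, eps):
    # Leading-order decay (eps/n)^n * (n!)^(-1/2) of the n-th minimal
    # worst-case error; the sub-exponential factor is left out.
    return (eps / n) ** n / math.sqrt(math.factorial(n))

# The rate decays super-exponentially in n:
for n in (1, 5, 10, 20):
    print(n, minimal_error_rate(n, eps=1.0))
```

For fixed $\varepsilon$, the printed values shrink faster than any geometric sequence, matching the super-exponential convergence the abstract describes.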