Approximating a univariate function on the interval $[-1,1]$ with a polynomial is among the most classical problems in numerical analysis. When the function evaluations come with noise, a least-squares fit is known to reduce the effect of noise as more samples are taken. The generic algorithm for the least-squares problem requires $O(Nn^2)$ operations, where $N+1$ is the number of sample points and $n$ is the degree of the polynomial approximant. This algorithm is unstable when $n$ is large, for example $n\gg \sqrt{N}$ for equispaced sample points. In this study, we blend numerical analysis and statistics to introduce a stable and fast $O(N\log N)$ algorithm called NoisyChebtrunc based on Chebyshev interpolation. It has the same error-reduction effect as least-squares, and the convergence is spectral until the error reaches $O(\sigma \sqrt{{n}/{N}})$, where $\sigma$ is the noise level, after which the error continues to decrease at the Monte Carlo rate $O(1/\sqrt{N})$. To determine the polynomial degree, NoisyChebtrunc employs a statistical criterion, namely Mallows' $C_p$. We analyze NoisyChebtrunc by bounding its variance and establishing concentration inequalities in the infinity norm with respect to the underlying noiseless function. These results show that, with high probability, the infinity-norm error is bounded by a small constant times $\sigma \sqrt{{n}/{N}}$ when the noise is independent and follows a subgaussian or subexponential distribution. We illustrate the performance of NoisyChebtrunc with numerical experiments.
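To make the abstract concrete, here is a minimal Python sketch of the overall idea, not the authors' reference implementation: sample at the $N+1$ Chebyshev points of the second kind, compute the Chebyshev coefficients of the degree-$N$ interpolant with an FFT in $O(N\log N)$ operations, and truncate the series at a degree selected by the standard Mallows' $C_p$ formula. The function name `noisy_cheb_trunc`, the assumption that the noise level $\sigma$ is known, and the simple $O(N^2)$ scan over candidate degrees are illustrative choices; the paper's criterion and fast degree selection may differ in detail.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

def noisy_cheb_trunc(f_noisy, N, sigma):
    """Illustrative sketch: Chebyshev interpolation of noisy samples followed
    by truncation at a degree chosen via Mallows' C_p (sigma assumed known)."""
    x = np.cos(np.pi * np.arange(N + 1) / N)   # Chebyshev points of the 2nd kind
    y = f_noisy(x)                             # noisy function evaluations

    # Chebyshev coefficients of the degree-N interpolant via an FFT of the
    # even extension of the samples (O(N log N)).
    ext = np.concatenate([y, y[N - 1:0:-1]])
    a = np.real(np.fft.fft(ext))[:N + 1] / N
    a[0] /= 2
    a[N] /= 2

    # Standard Mallows' C_p over candidate truncation degrees n:
    # C_p(n) = RSS(n)/sigma^2 - (N+1) + 2(n+1).  Simple scan for clarity.
    best_n, best_cp = 0, np.inf
    for n in range(N + 1):
        resid = y - C.chebval(x, a[:n + 1])
        cp = resid @ resid / sigma**2 - (N + 1) + 2 * (n + 1)
        if cp < best_cp:
            best_n, best_cp = n, cp
    return a[:best_n + 1]                      # truncated Chebyshev coefficients

# Example: recover cos(5x) from samples corrupted by Gaussian noise.
rng = np.random.default_rng(0)
coeffs = noisy_cheb_trunc(
    lambda x: np.cos(5 * x) + 0.01 * rng.standard_normal(x.shape),
    N=1000, sigma=0.01)
print("chosen degree:", len(coeffs) - 1)
```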