We present new convergence estimates for generalized empirical interpolation methods in terms of the entropy numbers of the parametrized function class. Our analysis is transparent and yields sharper convergence rates than the classical analysis based on the Kolmogorov n-width. In addition, we derive novel entropy-based convergence estimates for the Chebyshev greedy algorithm for sparse n-term nonlinear approximation of a target function. This likewise improves upon the classical convergence analysis when the corresponding entropy numbers decay fast enough.