This article addresses the problem of approximating a function in a Hilbert space by an expansion over a dictionary $\mathbb{D}$. We introduce the notion of a smoothly parameterized dictionary and give upper bounds on the approximation rates, metric entropy, and $n$-widths of the absolute convex hull, which we denote $B_1(\mathbb{D})$, of such dictionaries. The upper bounds depend upon the order of smoothness of the parameterization, and improve upon existing results in many cases. The main applications of these results are to the dictionaries $\mathbb{D} = \{\sigma(\omega\cdot x + b)\}\subset L^2$ corresponding to shallow neural networks with activation function $\sigma$, and to the dictionary of decaying Fourier modes corresponding to the spectral Barron space. This improves upon existing approximation rates for shallow neural networks when $\sigma = \text{ReLU}^k$ for $k\geq 2$, sharpens bounds on the metric entropy, and provides the first bounds on the Gelfand $n$-widths of the Barron space and spectral Barron space.