A well-known line of work (Barron, 1993; Breiman, 1993; Klusowski & Barron, 2018) provides bounds on the width $n$ of a two-layer ReLU neural network needed to approximate a function $f$ over the ball $\mathcal{B}_R(\R^d)$ up to error $\epsilon$, when the Fourier-based quantity $C_f = \int_{\R^d} \|\xi\|^2 |\hat{f}(\xi)| \, d\xi$ is finite. More recently, Ongie et al. (2019) used the Radon transform as a tool to analyze infinite-width two-layer ReLU networks. In particular, they introduced the Radon-based $\mathcal{R}$-norm and showed that a function defined on $\R^d$ can be represented as an infinite-width two-layer neural network if and only if its $\mathcal{R}$-norm is finite. In this work, we extend the framework of Ongie et al. (2019) and define similar Radon-based semi-norms ($\mathcal{R},\mathcal{U}$-norms) such that a function admits an infinite-width neural network representation on a bounded open set $\mathcal{U} \subseteq \R^d$ when its $\mathcal{R},\mathcal{U}$-norm is finite. Building on this, we derive sparse (finite-width) neural network approximation bounds that refine those of Breiman (1993) and Klusowski & Barron (2018). Finally, we show that infinite-width neural network representations on bounded open sets are not unique and study their structure, providing a functional view of mode connectivity.
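For concreteness, here is a minimal sketch of the parameterization underlying these results (the affine skip term $\langle v, x \rangle + c$ and the restriction of $w$ to the unit sphere follow the conventions of Ongie et al. (2019); the exact normalization is an assumption here). A width-$n$ two-layer ReLU network takes the form
$$f_n(x) = \sum_{i=1}^{n} a_i \big( \langle w_i, x \rangle - b_i \big)_+ + \langle v, x \rangle + c, \qquad (t)_+ := \max\{0, t\},$$
with $a_i, b_i, c \in \R$ and $w_i, v \in \R^d$, and its infinite-width analogue replaces the finite sum by integration against a signed measure $\mu$ over the neuron parameters $(w, b)$:
$$f(x) = \int_{\mathbb{S}^{d-1} \times \R} \big( \langle w, x \rangle - b \big)_+ \, d\mu(w, b) + \langle v, x \rangle + c.$$
The width bounds cited above control how large $n$ must be for some $f_n$ of this form to approximate $f$ up to error $\epsilon$, with the dependence on $f$ entering through $C_f$ (or, in our setting, through the $\mathcal{R},\mathcal{U}$-norm).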