We study the problem of approximating compactly-supported integrable functions by feedforward neural networks that simultaneously implement their support set. Our first main result recasts this "structured" approximation problem as a universality problem. We do this by constructing a refinement of the usual topology on the space $L^1_{\operatorname{loc}}(\mathbb{R}^d,\mathbb{R}^D)$ of locally-integrable functions, in which compactly-supported functions can be approximated in $L^1$-norm only by functions with matching discretized support. We establish the universality of ReLU feedforward networks with bilinear pooling layers in this refined topology. Consequently, ReLU feedforward networks with bilinear pooling can approximate compactly-supported functions while implementing their discretized support. We derive a quantitative, uniform version of our universal approximation theorem on the dense subclass of compactly-supported Lipschitz functions. This quantitative result bounds the depth, width, and number of bilinear pooling layers required to construct such a ReLU network in terms of the target function's regularity, the metric capacity and diameter of its essential support, and the dimensions of the input and output spaces. Conversely, we show that polynomial regressors and analytic feedforward networks are not universal in this space.
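As a hedged illustration of the architecture in question (a minimal sketch assuming the common outer-product form of bilinear pooling; the paper's precise parametrization may differ), a bilinear pooling layer acting on a hidden ReLU feature vector $u \in \mathbb{R}^m$, with hypothetical weights $W$ and bias $b$, can be written as
\[
  u \;=\; \sigma(Wx + b), \qquad \sigma = \mathrm{ReLU}, \qquad
  \mathrm{BP}(u) \;=\; \operatorname{vec}\!\bigl(u\,u^{\top}\bigr) \;=\; (u_i\, u_j)_{1 \le i,j \le m}.
\]
In this form, the layer introduces quadratic interactions between ReLU features before subsequent affine-ReLU layers are applied, which is what distinguishes such networks from purely affine-ReLU feedforward models.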