This paper examines the approximation capabilities of coarsely quantized neural networks -- those whose parameters are selected from a small set of allowable values. We show that any smooth multivariate function can be arbitrarily well approximated by an appropriate coarsely quantized neural network and provide a quantitative approximation rate. For the quadratic activation, this can be done with only a one-bit alphabet; for the ReLU activation, we use a three-bit alphabet. The main theorems rely on important properties of Bernstein polynomials. We prove new results on approximation of functions with Bernstein polynomials, noise-shaping quantization on the Bernstein basis, and implementation of the Bernstein polynomials by coarsely quantized neural networks.
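To make the pipeline concrete, here is a minimal sketch (not the paper's construction, which additionally implements the result as a quantized network) of the two analytic ingredients: approximating a smooth function by its Bernstein polynomial, then replacing the sampled coefficients with values from a one-bit alphabet {-1, +1} via first-order noise-shaping (Sigma-Delta) quantization. The target function, degree, and greedy quantization rule are illustrative choices, not taken from the paper.

```python
import math

def bernstein_coeffs(f, n):
    # Coefficients of the degree-n Bernstein polynomial are samples f(k/n).
    return [f(k / n) for k in range(n + 1)]

def sigma_delta_quantize(coeffs, alphabet=(-1.0, 1.0)):
    # First-order Sigma-Delta: pick the alphabet element nearest to the
    # running state u + c_k, so the state u stays bounded and the
    # quantization error is "shaped" into differences of basis functions.
    q, u = [], 0.0
    for c in coeffs:
        qk = min(alphabet, key=lambda a: abs(u + c - a))
        u += c - qk
        q.append(qk)
    return q

def bernstein_eval(coeffs, x):
    # Evaluate sum_k coeffs[k] * C(n,k) * x^k * (1-x)^(n-k).
    n = len(coeffs) - 1
    return sum(c * math.comb(n, k) * x**k * (1 - x) ** (n - k)
               for k, c in enumerate(coeffs))

# Illustrative smooth target with values in [-1, 1].
f = lambda x: math.sin(math.pi * x) - 0.5
n = 200
c = bernstein_coeffs(f, n)       # real-valued coefficients
q = sigma_delta_quantize(c)      # one-bit coefficients in {-1, +1}
```

Because successive Bernstein basis functions differ little for large n, the noise-shaped error sum telescopes and the one-bit expansion `bernstein_eval(q, x)` stays close to `f(x)` even though each coefficient carries a single bit.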