We provide an entropy bound for the spaces of neural networks with piecewise linear activation functions, such as the ReLU and the absolute value function. This bound generalizes the known entropy bound for the space of linear functions on $\mathbb{R}^d$, and it depends on the value at the point $(1,1,\dots,1)$ of the network obtained by taking the absolute values of all parameters of the original network. Keeping this value, together with the depth, width, and parameters of the networks, logarithmically dependent on $1/\varepsilon$, we $\varepsilon$-approximate functions that are analytic on certain regions of $\mathbb{C}^d$.
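To make the key quantity concrete, here is a minimal sketch assuming a standard feedforward parameterization; the symbols $W_i$, $b_i$, $\sigma$, and $\bar f$ are our notation, not taken from the paper. For a network $f(x) = W_L\,\sigma\big(W_{L-1}\cdots\sigma(W_1 x + b_1)\cdots + b_{L-1}\big) + b_L$, the quantity in question is obtained by replacing every parameter with its entrywise absolute value and evaluating at the all-ones point:
$$\bar f(\mathbf{1}), \qquad \bar f(x) = |W_L|\,\sigma\big(|W_{L-1}|\cdots\sigma(|W_1|\,x + |b_1|)\cdots + |b_{L-1}|\big) + |b_L|, \qquad \mathbf{1} = (1,1,\dots,1),$$
where $|W_i|$ and $|b_i|$ denote entrywise absolute values and $\sigma$ is the piecewise linear activation.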