We provide an entropy bound for the spaces of path-norm regularized neural networks with piecewise linear activation functions, such as the ReLU and the absolute value functions. This bound generalizes the known entropy bound for the spaces of linear functions on $\mathbb{R}^d$. Keeping the path norm, together with the depth, width, and weights of the networks, logarithmically dependent on $1/\varepsilon$, we $\varepsilon$-approximate functions that are analytic on certain regions of $\mathbb{C}^d$.
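For concreteness, a standard definition of the $\ell^1$ path norm of a bias-free, depth-$L$ network with weight matrices $W_1,\dots,W_L$ is sketched below; the abstract does not spell out the paper's exact convention, which may differ in details such as the treatment of biases. The path norm sums, over all input-output paths, the products of the absolute values of the weights along each path:
$$\|\theta\|_{\mathrm{path}} \;=\; \sum_{\text{paths } p}\; \prod_{\ell=1}^{L} \bigl|(W_\ell)_{p_\ell,\,p_{\ell-1}}\bigr| \;=\; \mathbf{1}^\top\, |W_L|\,|W_{L-1}|\cdots|W_1|\,\mathbf{1},$$
where $|W_\ell|$ denotes the entrywise absolute value and $\mathbf{1}$ the all-ones vector. The second equality holds because expanding the matrix product enumerates exactly the paths $p = (p_0, p_1, \dots, p_L)$ through the layers.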