One of the central problems in deep learning theory is to understand how structural properties, such as depth, width, and the number of nodes, affect the expressivity of deep neural networks. In this work, we establish a new connection between the expressivity of deep neural networks and topological entropy from dynamical systems, which can be used to characterize depth-width trade-offs of neural networks. We provide an upper bound on the topological entropy of neural networks with continuous semi-algebraic units in terms of their structural parameters. Specifically, the topological entropy of a ReLU network with $l$ layers and $m$ nodes per layer is upper bounded by $O(l\log m)$. Furthermore, if a neural network is a good approximation of some function $f$, then the size of the network admits a lower bound that is exponential in the topological entropy of $f$. Finally, we discuss the relationship between topological entropy, the number of oscillations, periods, and the Lipschitz constant.
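To give a sense of where a bound of this shape can come from, here is a minimal sketch under two standard facts assumed for illustration (it is not the paper's proof): the Misiurewicz-Szlenk characterization of the topological entropy of a continuous piecewise monotone interval map $f$ via the growth rate of its lap number $\ell(\cdot)$ (the number of maximal monotone pieces), and the standard counting bound that a ReLU network on $\mathbb{R}$ with $l$ layers and $m$ nodes per layer is piecewise linear with at most $(2m)^l$ pieces. Since lap numbers are submultiplicative under composition, $\ell(f\circ g)\le \ell(f)\,\ell(g)$, one obtains
\[
h(f) \;=\; \lim_{n\to\infty}\frac{1}{n}\log \ell(f^{n})
\;\le\; \log \ell(f)
\;\le\; \log\bigl((2m)^{l}\bigr)
\;=\; l\log(2m)
\;=\; O(l\log m).
\]
Read contrapositively, representing a function $f$ with topological entropy $h(f)$ (in a sense strong enough to preserve entropy) forces $l\log(2m)\ge h(f)$, i.e., $m\ge \tfrac{1}{2}e^{h(f)/l}$, which is the shape of the exponential size lower bound stated above.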