Convergence of deep neural networks as the depth of the networks tends to infinity is fundamental in building the mathematical foundation for deep learning. In a previous study, we investigated this question for deep ReLU networks with a fixed width. That setting does not cover the important convolutional neural networks, whose widths increase from layer to layer. For this reason, we first study convergence of general ReLU networks with increasing widths and then apply the results obtained to deep convolutional neural networks. It turns out that the convergence reduces to convergence of infinite products of matrices with increasing sizes, which has not been considered in the literature. We establish sufficient conditions for convergence of such infinite products of matrices. Based on these conditions, we present sufficient conditions for piecewise convergence of general deep ReLU networks with increasing widths, as well as pointwise convergence of deep ReLU convolutional neural networks.
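The central object, an infinite product of matrices with increasing sizes, can be illustrated numerically. The sketch below is not the paper's construction; it is a minimal illustrative experiment under one natural assumption: each layer matrix is a rectangular "identity embedding" plus a perturbation whose norms are summable over the layers. Under this assumption the partial products applied to a fixed input stabilize on any fixed leading block, which is the kind of convergence in question. All names (`embed_identity`, the width schedule, the 1/n² decay) are hypothetical choices made for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def embed_identity(rows, cols):
    """Rectangular 'identity' E (rows >= cols): E @ x pads x with zeros."""
    E = np.zeros((rows, cols))
    E[:cols, :cols] = np.eye(cols)
    return E

# Hypothetical width schedule increasing from layer to layer: m_n = 2 + n.
widths = [2 + n for n in range(60)]

x = rng.standard_normal(widths[0])
outputs = []
v = x
for n in range(1, len(widths)):
    rows, cols = widths[n], widths[n - 1]
    # Perturbation whose norm decays summably (here ~ 1/n^2) -- the
    # assumption under which we expect the partial products to converge.
    B = rng.standard_normal((rows, cols)) / (rows * n**2)
    A = embed_identity(rows, cols) + B
    v = A @ v
    outputs.append(v.copy())

# Compare successive partial-product outputs on their common leading block;
# the differences should shrink as the depth grows.
diffs = [np.linalg.norm(outputs[k + 1][: len(outputs[k])] - outputs[k])
         for k in range(len(outputs) - 1)]
print(diffs[0], diffs[-1])
```

Dropping the 1/n² decay (e.g. using perturbations of constant norm) makes the differences stop shrinking, which is why a summability-type condition on the layer matrices is the natural hypothesis for convergence of such products.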