This paper focuses on establishing $L^2$ approximation properties for deep ReLU convolutional neural networks (CNNs) in two-dimensional space. The analysis is based on a decomposition theorem for convolutional kernels with large spatial size and multiple channels. Given this decomposition, the properties of the ReLU activation function, and a specific channel structure, a universal approximation theorem for deep ReLU CNNs with the classic structure is obtained by showing their connection with one-hidden-layer ReLU neural networks (NNs). Furthermore, approximation properties are obtained for one version of neural networks with the ResNet, pre-act ResNet, and MgNet architectures, based on the connections between these networks.
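In schematic form (the notation below is assumed for illustration and is not taken verbatim from the paper), the decomposition asserts that the action of a kernel $K$ of spatial size $(2\ell+1)\times(2\ell+1)$ can be reproduced by $\ell$ successive multi-channel convolutions with $3\times 3$ kernels,
\[
K \ast x \;=\; \theta^{(\ell)} \ast \theta^{(\ell-1)} \ast \cdots \ast \theta^{(1)} \ast x,
\]
where each $\theta^{(j)}$ has spatial size $3\times 3$ and the required channel widths of the intermediate kernels are those specified by the theorem (omitted here).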