We study the approximation of shift-invariant or shift-equivariant functions by deep fully convolutional networks from a dynamical systems perspective. We prove that deep residual fully convolutional networks and their continuous-layer counterparts can achieve universal approximation of these symmetric functions at constant channel width. Moreover, we show that the same can be achieved by non-residual variants with at least 2 channels in each layer and a convolutional kernel size of at least 2. In addition, we show that these requirements are necessary, in the sense that networks with fewer channels or smaller kernels fail to be universal approximators.
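As a concrete point of reference, the following minimal sketch (not the paper's construction; the class name, channel width, and kernel size are illustrative assumptions) shows a constant-width residual fully convolutional block on 1D signals, where circular padding makes each layer equivariant to cyclic shifts and the residual update mirrors the discrete dynamical-systems view.

```python
# Illustrative sketch only: a constant-width residual fully convolutional block.
# Circular padding keeps every layer equivariant to cyclic shifts of the input.
import torch
import torch.nn as nn


class ResidualConvBlock(nn.Module):
    def __init__(self, channels: int = 2, kernel_size: int = 3):
        super().__init__()
        # Same input/output channel count, so depth can grow at constant width.
        self.conv = nn.Conv1d(channels, channels, kernel_size,
                              padding=kernel_size // 2, padding_mode="circular")

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Residual update x_{t+1} = x_t + f(x_t): a discrete step of a flow map.
        return x + torch.relu(self.conv(x))


if __name__ == "__main__":
    block = ResidualConvBlock(channels=2, kernel_size=3)
    x = torch.randn(1, 2, 16)  # (batch, channels, length)
    # Shift-equivariance check: shifting the input cyclically shifts the output.
    shift_then_map = block(torch.roll(x, shifts=3, dims=-1))
    map_then_shift = torch.roll(block(x), shifts=3, dims=-1)
    print(torch.allclose(shift_then_map, map_then_shift, atol=1e-6))
```

Stacking such blocks (or averaging the final features over the spatial dimension to obtain a shift-invariant output) gives the kind of deep, constant-width fully convolutional architecture whose approximation power the abstract describes.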