The use of Convolutional Neural Networks (CNNs) is widespread in Deep Learning due to a range of desirable model properties that result in an efficient and effective machine learning framework. However, performant CNN architectures must be tailored to specific tasks in order to account for considerations such as the input length, resolution, and dimensionality. In this work, we overcome the need for problem-specific CNN architectures with our Continuous Convolutional Neural Network (CCNN): a single CNN architecture equipped with continuous convolutional kernels that can be used for tasks on data of arbitrary resolution, dimensionality, and length without structural changes. Continuous convolutional kernels model long-range dependencies at every layer, and remove the need for the downsampling layers and task-dependent depths required by current CNN architectures. We show the generality of our approach by applying the same CCNN to a wide set of tasks on sequential (1$\mathrm{D}$) and visual data (2$\mathrm{D}$). Our CCNN performs competitively and often outperforms the current state-of-the-art across all tasks considered.
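To make the core idea concrete, the following is a minimal sketch (not the paper's implementation) of a continuous convolutional kernel: instead of storing a fixed-size weight vector, kernel values are generated by a small network evaluated at continuous relative positions, so the same kernel can be sampled at any input resolution or length. The network `continuous_kernel` and its sine nonlinearity are illustrative assumptions here, not the authors' exact parameterization.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tiny kernel network: maps a relative position in [-1, 1]
# to a scalar kernel value. Weights are random for illustration only.
W1 = rng.normal(size=(1, 16)); b1 = np.zeros(16)
W2 = rng.normal(size=(16, 1)); b2 = np.zeros(1)

def continuous_kernel(positions):
    """Evaluate the kernel network at arbitrary relative positions."""
    h = np.sin(positions[:, None] @ W1 + b1)  # assumed sine nonlinearity
    return (h @ W2 + b2).squeeze(-1)

def conv_at_resolution(signal):
    """Sample the same continuous kernel at the signal's own resolution,
    giving a kernel as long as the input (a 'global' receptive field),
    then apply a 1D convolution."""
    n = len(signal)
    positions = np.linspace(-1.0, 1.0, n)  # one kernel sample per input step
    kernel = continuous_kernel(positions)
    return np.convolve(signal, kernel, mode="same")

# The same kernel network handles two resolutions without any structural change:
low = conv_at_resolution(rng.normal(size=64))
high = conv_at_resolution(rng.normal(size=256))
```

Because the kernel is a function of position rather than a fixed array, its sampled length always matches the input, which is what lets one architecture cover inputs of arbitrary resolution and length.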