The Gaussian kernel and its derivatives have already been employed in Convolutional Neural Networks in several previous works. Most of these works propose to compute filters as linear combinations of one or several bases of fixed or partially trainable Gaussian kernels, with or without their derivatives. In this article, we propose a highly configurable layer based on anisotropic, oriented, and shifted Gaussian derivative kernels, which generalizes notions encountered in previous related works while keeping their main advantages. The results show that the proposed layer achieves competitive performance compared to previous works and that it can be successfully included in common deep architectures such as VGG16 for image classification and U-net for image segmentation.
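To make the construction concrete, the following is a minimal NumPy sketch of an anisotropic, oriented, and shifted Gaussian derivative kernel, with a filter formed as a linear combination of such basis kernels. The function name, parameterization, and combination weights are illustrative assumptions, not the paper's exact formulation (in the actual layer, the weights would be trainable).

```python
import numpy as np

def gaussian_derivative_kernel(size, sigma_x, sigma_y, theta,
                               mu_x=0.0, mu_y=0.0, order_x=0, order_y=0):
    """Anisotropic, oriented, shifted 2D Gaussian (derivative) kernel.

    Illustrative sketch only: parameter names and normalization are
    assumptions. Supports derivatives up to first order per axis.
    """
    half = (size - 1) / 2.0
    ys, xs = np.mgrid[-half:half + 1, -half:half + 1]
    # rotate the sampling grid by theta, then shift by (mu_x, mu_y)
    xr = np.cos(theta) * xs + np.sin(theta) * ys - mu_x
    yr = -np.sin(theta) * xs + np.cos(theta) * ys - mu_y
    # anisotropic Gaussian: separate scales along the rotated axes
    g = np.exp(-0.5 * ((xr / sigma_x) ** 2 + (yr / sigma_y) ** 2))
    g /= g.sum()
    # first-order Gaussian derivatives along the rotated axes
    if order_x == 1:
        g = g * (-xr / sigma_x ** 2)
    if order_y == 1:
        g = g * (-yr / sigma_y ** 2)
    return g

# a filter as a linear combination of basis kernels
# (in a trainable layer, these weights would be learned)
basis = [gaussian_derivative_kernel(7, 1.5, 0.8, np.pi / 4, order_x=o)
         for o in (0, 1)]
weights = np.array([0.6, -0.4])
filt = sum(w * b for w, b in zip(weights, basis))
```

The anisotropy comes from the distinct `sigma_x`/`sigma_y` scales, the orientation from the grid rotation by `theta`, and the shift from the `(mu_x, mu_y)` offset; higher-order derivatives would extend the basis in the same way.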