Convolutional Neural Networks (CNNs) have achieved tremendous success in a number of learning tasks, including image classification. Recent advanced CNN models, such as ResNets, mainly rely on skip connections to avoid gradient vanishing. DenseNet suggests creating additional bypasses to transfer features as an alternative network-design strategy. In this paper, we design Attentive Feature Integration (AFI) modules, which are widely applicable to most recent network architectures, leading to new architectures named AFI-Nets. AFI-Nets explicitly model the correlations among different levels of features and selectively transfer features with little overhead. AFI-ResNet-152 obtains a 1.24% relative improvement on the ImageNet dataset while decreasing FLOPs by about 10% and the number of parameters by about 9.2% compared to ResNet-152.
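To make the idea concrete, below is a minimal sketch of what an attentive feature-integration step could look like: feature maps from several earlier stages are weighted by learned, input-dependent attention scores and summed. The class name AFIModule, the pooling-plus-MLP attention, and all parameter names are illustrative assumptions, not the paper's exact design.

```python
import torch
import torch.nn as nn

class AFIModule(nn.Module):
    """Hypothetical sketch of attentive feature integration (not the
    paper's exact architecture): fuse feature maps from num_levels
    earlier stages via one learned attention weight per level."""

    def __init__(self, num_levels, channels, reduction=16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)  # global context per feature map
        # Small bottleneck MLP producing one attention score per level;
        # this keeps the added overhead small relative to the backbone.
        self.fc = nn.Sequential(
            nn.Linear(num_levels * channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, num_levels),
        )

    def forward(self, features):
        # features: list of num_levels tensors, each (B, C, H, W),
        # assumed here to share the same shape.
        context = torch.cat([self.pool(f).flatten(1) for f in features], dim=1)  # (B, L*C)
        weights = torch.softmax(self.fc(context), dim=1)                          # (B, L)
        stacked = torch.stack(features, dim=1)                                    # (B, L, C, H, W)
        # Selectively transfer features: attention-weighted sum over levels.
        return (weights[:, :, None, None, None] * stacked).sum(dim=1)             # (B, C, H, W)

# Usage sketch: integrate three same-shaped feature maps.
feats = [torch.randn(2, 64, 32, 32) for _ in range(3)]
afi = AFIModule(num_levels=3, channels=64)
out = afi(feats)  # (2, 64, 32, 32)
```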