In the design of deep neural architectures, recent studies have demonstrated the benefits of grouping subnetworks into a larger network. For example, the Inception architecture integrates multi-scale subnetworks, and the residual network can be viewed as a stack of residual units, each combining a residual subnetwork with an identity shortcut. In this work, we embrace this observation and propose the Competitive Pathway Network (CoPaNet). CoPaNet comprises a stack of competitive pathway units, each containing multiple parallel residual-type subnetworks followed by a max operation for feature competition. This mechanism enhances model capability by encouraging the subnetworks to learn a variety of features. The proposed strategy explicitly shows that features propagate through the pathways in various routing patterns, which we refer to as pathway encoding of category information. Moreover, cross-block shortcuts can be added to CoPaNet to encourage feature reuse. We evaluated the proposed CoPaNet on four object recognition benchmarks: CIFAR-10, CIFAR-100, SVHN, and ImageNet. CoPaNet obtained state-of-the-art or comparable results using a similar number of parameters. The code for CoPaNet is available at: https://github.com/JiaRenChang/CoPaNet.
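To make the competitive pathway mechanism concrete, the following is a minimal PyTorch sketch of a single unit, not the authors' reference implementation: it assumes each pathway is a plain pre-activation-style residual branch (conv-BN-ReLU-conv-BN plus identity shortcut) and that feature competition is an elementwise max over the pathway outputs. The class name `CoPaUnit` and the `num_pathways` parameter are illustrative choices.

```python
import torch
import torch.nn as nn


class CoPaUnit(nn.Module):
    """Illustrative competitive pathway unit (a sketch, not the official CoPaNet code).

    Each pathway is assumed to be a residual-type subnetwork
    (conv-BN-ReLU-conv-BN plus identity shortcut); feature competition
    is assumed to be an elementwise max across pathway outputs.
    """

    def __init__(self, channels, num_pathways=2):
        super().__init__()
        self.pathways = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(channels, channels, 3, padding=1, bias=False),
                nn.BatchNorm2d(channels),
                nn.ReLU(inplace=True),
                nn.Conv2d(channels, channels, 3, padding=1, bias=False),
                nn.BatchNorm2d(channels),
            )
            for _ in range(num_pathways)
        ])

    def forward(self, x):
        # Residual-type subnetworks: branch output plus identity shortcut.
        outs = [x + branch(x) for branch in self.pathways]
        # Feature competition: elementwise max over the parallel pathways.
        return torch.stack(outs, dim=0).max(dim=0).values


if __name__ == "__main__":
    # Usage example: a batch of 8 feature maps with 16 channels.
    unit = CoPaUnit(channels=16, num_pathways=2)
    y = unit(torch.randn(8, 16, 32, 32))
    print(y.shape)  # torch.Size([8, 16, 32, 32])
```

Because the max selects, per spatial location and channel, whichever pathway responds most strongly, the pattern of winning pathways across stacked units is what the abstract describes as pathway encoding.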