Structural re-parameterization (Rep) methods have achieved significant performance improvements on traditional convolutional networks. Most current Rep methods rely on prior knowledge to select the re-parameterization operations. However, the performance of the architecture is limited by the types of operations and by that prior knowledge. To break this restriction, in this work we design an improved re-parameterization search space that includes more types of re-parameterization operations. Concretely, this search space can further improve the performance of convolutional networks. To explore the search space effectively, we design an automatic re-parameterization enhancement strategy based on neural architecture search (NAS), which can find an excellent re-parameterization architecture. Besides, we visualize the output features of the architecture to analyze why the re-parameterization architecture takes the form it does. We achieve better results on public datasets. Under the same training conditions as ResNet, we improve the accuracy of ResNet-50 by 1.82% on ImageNet-1k.
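To illustrate the basic principle behind structural re-parameterization (not the paper's search space or NAS strategy), the following is a minimal sketch in PyTorch: a RepVGG-style set of parallel branches (3x3 conv, 1x1 conv, identity) is merged into a single 3x3 convolution at inference time. All names, channel counts, and the choice of branches here are illustrative assumptions.

```python
# Minimal sketch of branch merging in structural re-parameterization.
# Assumption: three parallel branches (3x3 conv, 1x1 conv, identity) without
# batch normalization; the real method would also fold BN statistics.
import torch
import torch.nn as nn
import torch.nn.functional as F

channels = 16
conv3 = nn.Conv2d(channels, channels, 3, padding=1, bias=True)
conv1 = nn.Conv2d(channels, channels, 1, bias=True)


def merge_branches(conv3, conv1, channels):
    # Pad the 1x1 kernel to 3x3 so it can be added to the 3x3 kernel.
    w1_padded = F.pad(conv1.weight.data, [1, 1, 1, 1])
    # Express the identity branch as a 3x3 kernel with 1 at the centre.
    w_id = torch.zeros(channels, channels, 3, 3)
    for i in range(channels):
        w_id[i, i, 1, 1] = 1.0
    # A single fused 3x3 conv carries the sum of all branch weights and biases.
    fused = nn.Conv2d(channels, channels, 3, padding=1, bias=True)
    fused.weight.data = conv3.weight.data + w1_padded + w_id
    fused.bias.data = conv3.bias.data + conv1.bias.data
    return fused


x = torch.randn(1, channels, 8, 8)
fused = merge_branches(conv3, conv1, channels)
y_multi = conv3(x) + conv1(x) + x   # multi-branch (training-time) output
y_fused = fused(x)                  # single-branch (inference-time) output
assert torch.allclose(y_multi, y_fused, atol=1e-5)
```

The key design point is that all branches are linear operators, so their parameters can be summed into one kernel; the paper's contribution is to enlarge the set of such mergeable operations and let NAS choose among them rather than relying on hand-picked branch combinations.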