In recent years, significant progress has been made in neural architecture search (NAS). However, searching for efficient networks remains challenging because of the gap between the constraint used during search and real inference time. To find a high-performance network with low inference time, several previous works impose a computational complexity constraint on the search algorithm. However, inference speed depends on many factors, and any single indicator (e.g., FLOPs or MACs) correlates only weakly with latency. Recently, structural re-parameterization (Rep) techniques have been proposed to convert multi-branch architectures into single-path ones that are inference-friendly. Nevertheless, the multi-branch architectures they start from are still human-designed and inefficient. In this work, we propose a new search space that is suitable for structural re-parameterization techniques. RepNAS, a one-stage NAS approach, is presented to efficiently search the optimal diverse branch block (ODBB) for each layer under a branch-number constraint. Our experimental results show that the searched ODBB easily surpasses the manually designed diverse branch block (DBB) with efficient training.
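The conversion that structural re-parameterization relies on follows from the linearity of convolution: parallel branches without intervening nonlinearities can be folded into a single kernel after training. Below is a minimal sketch (not the paper's code; the channel counts and layer shapes are illustrative assumptions) showing how a parallel 1x1 branch can be absorbed into a 3x3 convolution for inference:

```python
import torch
import torch.nn.functional as F

# Two parallel branches with no nonlinearity between them:
# a 3x3 conv and a 1x1 conv (shapes chosen for illustration).
conv3 = torch.nn.Conv2d(8, 8, kernel_size=3, padding=1, bias=True)
conv1 = torch.nn.Conv2d(8, 8, kernel_size=1, bias=True)

x = torch.randn(2, 8, 16, 16)
y_multi = conv3(x) + conv1(x)  # multi-branch output (training-time form)

# Re-parameterize: zero-pad the 1x1 kernel to 3x3 so it sits at the
# center tap, then sum kernels and biases into one conv for deployment.
fused = torch.nn.Conv2d(8, 8, kernel_size=3, padding=1, bias=True)
fused.weight.data = conv3.weight.data + F.pad(conv1.weight.data, [1, 1, 1, 1])
fused.bias.data = conv3.bias.data + conv1.bias.data

y_single = fused(x)
print(torch.allclose(y_multi, y_single, atol=1e-5))  # True: same function
```

Because the fused single-path network computes exactly the same function as the multi-branch one, the branch topology only affects training dynamics, not inference cost; this is what makes searching over branch combinations (as RepNAS does) attractive.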