In recent years, significant progress has been made in the field of neural architecture search (NAS). However, searching for efficient networks remains challenging because of the gap between the search constraint and the real inference time. To find a high-performance network with low inference time, several previous works impose a computational-complexity constraint on the search algorithm. However, many factors affect inference speed (e.g., FLOPs, MACs), and the correlation between any single indicator and latency is weak. Recently, re-parameterization (Rep) techniques have been proposed to convert multi-branch architectures into inference-friendly single-path ones. Nevertheless, these multi-branch architectures are still human-designed and inefficient. In this work, we propose a new search space suited to structural re-parameterization techniques. RepNAS, a one-stage NAS approach, is presented to efficiently search the optimal diverse branch block (ODBB) for each layer under a branch-number constraint. Our experimental results show that the searched ODBB easily surpasses the manually designed diverse branch block (DBB) with efficient training. Code and models will be released soon.
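The conversion the abstract refers to rests on the linearity of convolution: the outputs of parallel linear branches (e.g., a 3x3 conv, a 1x1 conv, and an identity shortcut) can be summed into a single equivalent 3x3 kernel at inference time. Below is a minimal single-channel, stride-1 NumPy sketch of that folding step; the function and variable names (`conv2d`, `k3`, `k1`) are illustrative and not from the paper's code, and real Rep implementations additionally fold batch-norm statistics into the kernels.

```python
import numpy as np

def conv2d(x, k):
    # 'same' cross-correlation: single channel, stride 1, zero padding
    kh, kw = k.shape
    ph, pw = kh // 2, kw // 2
    xp = np.pad(x, ((ph, ph), (pw, pw)))
    out = np.zeros_like(x, dtype=float)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            out[i, j] = np.sum(xp[i:i + kh, j:j + kw] * k)
    return out

rng = np.random.default_rng(0)
x = rng.standard_normal((6, 6))
k3 = rng.standard_normal((3, 3))   # 3x3 branch
k1 = rng.standard_normal((1, 1))   # 1x1 branch

# Multi-branch output at training time: 3x3 conv + 1x1 conv + identity.
y_branches = conv2d(x, k3) + conv2d(x, k1) + x

# Fold all three branches into one 3x3 kernel for inference:
k_merged = k3.copy()
k_merged[1, 1] += k1[0, 0]  # a 1x1 kernel lands in the 3x3 kernel's center
k_merged[1, 1] += 1.0       # the identity branch is a centered delta kernel
y_merged = conv2d(x, k_merged)

print(np.allclose(y_branches, y_merged))  # the two forms are numerically equal
```

Because the merged single-path network computes exactly the same function, the multi-branch structure costs nothing at inference; RepNAS exploits this by searching over which branches to keep during training.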