This paper presents a novel neural architecture search method, called LiDNAS, for generating lightweight monocular depth estimation models. Unlike previous neural architecture search (NAS) approaches, in which finding optimized networks is computationally demanding, the introduced novel Assisted Tabu Search enables efficient architecture exploration. Moreover, we construct the search space on a pre-defined backbone network to balance layer diversity and search space size. The LiDNAS method outperforms the state-of-the-art NAS approach proposed for disparity and depth estimation in terms of both search efficiency and output model performance. The LiDNAS-optimized models achieve results superior to the compact depth estimation state of the art on NYU-Depth-v2, KITTI, and ScanNet, while being 7%-500% more compact in size, i.e., in the number of model parameters.
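To make the tabu-search component concrete, the sketch below shows a generic tabu search over a discrete layer-choice encoding of a fixed-backbone search space. It is not the paper's Assisted Tabu Search: the architecture encoding, the single-layer-mutation neighborhood, and the `evaluate` objective are illustrative assumptions only.

```python
# Minimal sketch of tabu-search-based architecture exploration.
# NOT the paper's Assisted Tabu Search; encoding, neighborhood, and
# evaluate() are placeholder assumptions for illustration.
import random

def evaluate(arch):
    """Placeholder objective: in practice this would train/score the candidate."""
    return -sum(arch)  # stand-in score; higher is better

def neighbors(arch, choices_per_layer):
    """All candidates differing from `arch` in exactly one layer choice."""
    for i in range(len(arch)):
        for c in range(choices_per_layer):
            if c != arch[i]:
                yield arch[:i] + (c,) + arch[i + 1:]

def tabu_search(num_layers=8, choices_per_layer=4, iters=50, tabu_len=10):
    current = tuple(random.randrange(choices_per_layer) for _ in range(num_layers))
    best, best_score = current, evaluate(current)
    tabu = [current]  # recently visited points are forbidden moves
    for _ in range(iters):
        cands = [a for a in neighbors(current, choices_per_layer) if a not in tabu]
        if not cands:
            break
        current = max(cands, key=evaluate)   # best admissible neighbor
        tabu = (tabu + [current])[-tabu_len:]  # fixed-length tabu list
        score = evaluate(current)
        if score > best_score:
            best, best_score = current, score
    return best, best_score

if __name__ == "__main__":
    arch, score = tabu_search()
    print("best architecture:", arch, "score:", score)
```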