Evaluating neural network performance is critical to deep neural network design but a costly procedure. Neural predictors provide an efficient solution by treating architectures as samples and learning to estimate their performance on a given task. However, existing predictors are task-dependent, predominantly estimating neural network performance on image classification benchmarks. They are also search-space dependent; each predictor is designed to make predictions for a specific architecture search space with predefined topologies and sets of operations. In this paper, we propose a novel All-in-One Predictor (AIO-P), which aims to pretrain neural predictors on architecture examples from multiple, separate computer vision (CV) task domains and multiple architecture spaces, and then transfer to unseen downstream CV tasks or neural architectures. We describe our proposed techniques for general graph representation, efficient predictor pretraining, and knowledge infusion, as well as methods for transferring to downstream tasks/spaces. Extensive experimental results show that AIO-P achieves Mean Absolute Error (MAE) below 1% and Spearman's Rank Correlation (SRCC) above 0.5 on a breadth of target downstream CV tasks, with or without fine-tuning, outperforming a number of baselines. Moreover, AIO-P can transfer directly to new architectures not seen during training, accurately rank them, and serve as an effective performance estimator when paired with an algorithm designed to preserve performance while reducing FLOPs.
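As a concrete illustration of the two metrics quoted above, the sketch below shows how MAE and SRCC could be computed for a predictor's outputs against measured architecture performance. This is a minimal sketch, not the paper's implementation; `predictor` and `arch_graphs` are hypothetical placeholders for a trained AIO-P-style model and a batch of encoded architectures.

```python
# Minimal sketch (hypothetical interface, not the paper's API): scoring a
# neural predictor with MAE and Spearman's rank correlation (SRCC).
import numpy as np
from scipy.stats import spearmanr

def evaluate_predictor(predictor, arch_graphs, true_perf):
    """Compare predicted vs. measured performance for a set of architectures."""
    pred_perf = np.array([predictor(g) for g in arch_graphs])  # predicted scores
    true_perf = np.asarray(true_perf)                          # measured scores

    mae = np.mean(np.abs(pred_perf - true_perf))   # lower is better (< 1% reported)
    srcc, _ = spearmanr(pred_perf, true_perf)      # higher is better (> 0.5 reported)
    return mae, srcc
```

SRCC is emphasized alongside MAE because ranking architectures correctly is often more important than predicting their exact scores when the predictor guides a search.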