This paper explores the feasibility of neural architecture search (NAS) given only a pre-trained model, without using any original training data. This setting matters in real-world scenarios where the training data must remain inaccessible, e.g., for privacy protection or bias avoidance. To achieve this, we start by synthesizing usable data by recovering the knowledge embedded in a pre-trained deep neural network. We then use the synthesized data and their predicted soft labels to guide the neural architecture search. We identify that the NAS task requires the synthesized data (we target the image domain here) to have sufficient semantics, diversity, and a minimal domain gap from natural images. For semantics, we propose recursive label calibration to produce more informative outputs. For diversity, we propose a regional update strategy to generate more diverse and semantically enriched synthetic data. For a minimal domain gap, we use input- and feature-level regularization to mimic the original data distribution in the latent space. We instantiate our proposed framework with three popular NAS algorithms: DARTS, ProxylessNAS, and SPOS. Surprisingly, the architectures discovered by searching over our synthetic data achieve accuracy comparable to, or even higher than, those discovered by searching over the original data. This establishes, for the first time, that NAS can be done effectively without access to the original (natural) data, provided the synthesis method is well designed. Our code will be publicly available.
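As a minimal illustration of the recursive label calibration idea described above, the sketch below alternates between (a) optimizing a synthetic input toward the current soft target and (b) recursively blending that target with the model's own prediction. The tiny linear "pre-trained model," the blending factor `alpha`, and the loop counts are all hypothetical stand-ins chosen for illustration; the paper's actual method operates on deep networks and images, with additional regional-update and distribution-matching regularizers not shown here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tiny "pre-trained model": a fixed linear classifier (W, b).
# In the paper's setting this would be a deep network over images; the
# linear stand-in only illustrates the data flow of the calibration loop.
D, C = 8, 3                               # input dim, number of classes
W = rng.normal(size=(D, C))
b = rng.normal(size=C)

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def predict(x):
    return softmax(x @ W + b)

def ce_grad_x(x, target):
    # Gradient of cross-entropy(softmax(xW + b), target) w.r.t. x
    # equals W @ (p - target) for a linear model.
    return W @ (predict(x) - target)

target_class = 1
target = np.eye(C)[target_class]          # start from a hard one-hot label
x = 0.1 * rng.normal(size=D)              # synthetic input starts as noise

alpha = 0.5                               # assumed old/new blending factor
for _ in range(5):                        # outer calibration rounds
    for _ in range(200):                  # inner input-optimization steps
        x -= 0.1 * ce_grad_x(x, target)
    # Recursive label calibration: soften the target with the model's
    # own prediction so the label carries class-similarity information.
    target = alpha * target + (1 - alpha) * predict(x)
    target /= target.sum()

print(int(predict(x).argmax()), target.round(3))
```

The key design point this sketch conveys is that the supervision signal is not frozen: each round the one-hot label absorbs the model's soft prediction, so the synthesized input is pushed toward samples the model finds semantically informative rather than adversarial one-hot extremes.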