Deep neural networks (NNs) perform well in various tasks (e.g., computer vision), largely owing to convolutional neural networks (CNNs). However, the difficulty of gathering high-quality data in industrial settings hinders the practical use of NNs. To cope with this issue, the concept of transfer learning (TL) has emerged, which fine-tunes NNs pre-trained on large-scale datasets for data-scarce situations. Therefore, this paper proposes a two-stage architectural fine-tuning method for image classification, inspired by neural architecture search (NAS). One of the main ideas of the proposed method is mutation with base architectures, which reduces the search cost by exploiting given architectural information. Moreover, an early-stopping criterion is adopted, which directly reduces NAS costs by terminating the search early. Experimental results verify that the proposed method reduces computational and search costs by up to 28.2% and 22.3%, respectively, compared to existing methods.
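To make the two ideas named above concrete, the following is a minimal sketch, not the paper's actual implementation: it assumes an architecture can be encoded as a list of layer widths, mutates a known-good base architecture instead of sampling from scratch, and stops the search once candidates stop improving. The names `BASE_ARCH`, `mutate`, `evaluate`, and `search`, and the stand-in random scoring, are all illustrative assumptions.

```python
import copy
import random

# Assumed base architecture (e.g., CNN channel widths); not from the paper.
BASE_ARCH = [64, 128, 256, 512]


def mutate(arch, choices=(32, 64, 128, 256, 512)):
    """Return a copy of `arch` with one randomly chosen layer width replaced.

    Starting from a base architecture (rather than a random one) is what
    lets the search reuse given architectural information.
    """
    child = copy.deepcopy(arch)
    child[random.randrange(len(child))] = random.choice(choices)
    return child


def evaluate(arch):
    """Stand-in for fine-tuning the candidate and measuring validation accuracy."""
    return random.random()  # placeholder score; a real run would train the model


def search(budget=50, patience=5):
    """Mutation-based search with early stopping after `patience` stale rounds."""
    best_arch, best_score = BASE_ARCH, evaluate(BASE_ARCH)
    stale = 0
    for _ in range(budget):
        cand = mutate(best_arch)
        score = evaluate(cand)
        if score > best_score:
            best_arch, best_score, stale = cand, score, 0
        else:
            stale += 1
        if stale >= patience:  # early stopping: no recent improvement
            break
    return best_arch, best_score


if __name__ == "__main__":
    arch, score = search()
    print(f"best architecture: {arch}, score: {score:.3f}")
```

The early-stopping check is what caps the NAS cost: instead of exhausting the full search budget, the loop exits as soon as `patience` consecutive candidates fail to improve on the best score.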