The current trend of applying transfer learning from CNNs trained on large datasets can be overkill when the target application is a custom, well-delimited problem with enough data to train a network from scratch. On the other hand, training custom, lighter CNNs requires either expertise, in the from-scratch case, or high-end resources, as in the case of hardware-aware neural architecture search (HW NAS), limiting access to the technology for non-specialist NN developers. For this reason, we present Colab NAS, an affordable HW NAS technique for producing lightweight, task-specific CNNs. Its novel derivative-free search strategy, inspired by Occam's razor, obtains state-of-the-art results on the Visual Wake Word dataset in just 4.5 GPU hours using free online GPU services such as Google Colaboratory and Kaggle Kernel.
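To make the idea of a derivative-free, Occam's-razor-inspired search concrete, here is a minimal sketch of such a loop: grow a candidate CNN's capacity only while validation accuracy keeps improving, and keep the smallest model that stops improving. All names here (build_model, val_accuracy, occam_search) and the kernel-doubling schedule are illustrative assumptions, not the actual Colab NAS implementation.

```python
import tensorflow as tf

def build_model(num_kernels, input_shape=(96, 96, 3), num_classes=2):
    """Hypothetical candidate: a small CNN whose width is the search variable."""
    inputs = tf.keras.Input(shape=input_shape)
    x = tf.keras.layers.Conv2D(num_kernels, 3, activation="relu")(inputs)
    x = tf.keras.layers.MaxPooling2D()(x)
    x = tf.keras.layers.Conv2D(2 * num_kernels, 3, activation="relu")(x)
    x = tf.keras.layers.GlobalAveragePooling2D()(x)
    outputs = tf.keras.layers.Dense(num_classes, activation="softmax")(x)
    return tf.keras.Model(inputs, outputs)

def val_accuracy(model, train_ds, val_ds, epochs=10):
    """Train a candidate briefly and return its validation accuracy."""
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(train_ds, epochs=epochs, verbose=0)
    return model.evaluate(val_ds, verbose=0)[1]

def occam_search(train_ds, val_ds, start_kernels=4, max_kernels=64):
    """Derivative-free search: double the width at each step and stop
    as soon as extra capacity no longer improves validation accuracy,
    so the smallest adequate model wins (Occam's razor)."""
    best_acc, best_model = 0.0, None
    k = start_kernels
    while k <= max_kernels:
        model = build_model(k)
        acc = val_accuracy(model, train_ds, val_ds)
        if acc <= best_acc:  # no gain from extra capacity: stop growing
            break
        best_acc, best_model = acc, model
        k *= 2
    return best_model, best_acc
```

Because each step only compares scalar accuracies between successive candidates, no gradients of the search objective are needed, which is what keeps this kind of strategy cheap enough to run within the quotas of free GPU services.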