Deep Convolutional Neural Networks (CNNs) have emerged as popular machine learning models for image classification during the past few years, owing to their ability to learn problem-specific features directly from the input images. The success of deep learning models has shifted the effort from hand-engineering features to engineering network architectures. However, designing a state-of-the-art CNN for a given task remains non-trivial and challenging, especially when the amount of training data is limited. To address this problem, transfer learning is a popularly adopted technique. When transferring the learned knowledge from one task to another, fine-tuning with target-dependent Fully Connected (FC) layers generally produces better results on the target task. In this paper, the proposed AutoFCL model attempts to learn the structure of the FC layers of a CNN automatically using Bayesian optimization. To evaluate the performance of the proposed AutoFCL, we utilize five pre-trained CNN models, namely VGG-16, ResNet, DenseNet, MobileNet, and NASNetMobile. The experiments are conducted on three benchmark datasets, namely CalTech-101, Oxford-102 Flowers, and UC Merced Land Use. Fine-tuning the newly learned (target-dependent) FC layers leads to state-of-the-art performance, according to the experiments carried out in this research. The proposed AutoFCL method outperforms the existing methods on the CalTech-101 and Oxford-102 Flowers datasets, achieving accuracies of 94.38% and 98.89%, respectively, and achieves a comparable accuracy of 96.83% on the UC Merced Land Use dataset. The source code of this research is available at https://github.com/shabbeersh/AutoFCL.
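To make the idea concrete, the following is a minimal sketch (not the authors' implementation) of searching the FC-layer structure of a frozen pre-trained backbone with Bayesian optimization. It assumes TensorFlow/Keras for the backbone, scikit-optimize's gp_minimize as the Bayesian optimizer, and an illustrative search space (depth, width, dropout rate); the data tensors are random placeholders standing in for a real dataset.

```python
# Sketch of AutoFCL-style FC-head search; the search space, optimizer choice
# (scikit-optimize), and training schedule are illustrative assumptions.
import tensorflow as tf
from skopt import gp_minimize
from skopt.space import Integer, Real

NUM_CLASSES = 102  # e.g. Oxford-102 Flowers

# Placeholder data; replace with the real target dataset.
x_train = tf.random.uniform((32, 224, 224, 3))
y_train = tf.random.uniform((32,), maxval=NUM_CLASSES, dtype=tf.int32)
x_val = tf.random.uniform((16, 224, 224, 3))
y_val = tf.random.uniform((16,), maxval=NUM_CLASSES, dtype=tf.int32)

def build_model(num_fc_layers, units, dropout_rate):
    """Frozen VGG-16 backbone plus a candidate target-dependent FC head."""
    backbone = tf.keras.applications.VGG16(
        weights="imagenet", include_top=False, pooling="avg",
        input_shape=(224, 224, 3))
    backbone.trainable = False  # only the new FC head is trained during search
    x = backbone.output
    for _ in range(num_fc_layers):
        x = tf.keras.layers.Dense(units, activation="relu")(x)
        x = tf.keras.layers.Dropout(dropout_rate)(x)
    out = tf.keras.layers.Dense(NUM_CLASSES, activation="softmax")(x)
    model = tf.keras.Model(backbone.input, out)
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

def objective(params):
    """Negative validation accuracy of a candidate FC-head configuration."""
    num_fc_layers, units, dropout_rate = params
    model = build_model(num_fc_layers, units, dropout_rate)
    history = model.fit(x_train, y_train, validation_data=(x_val, y_val),
                        epochs=3, batch_size=16, verbose=0)
    return -max(history.history["val_accuracy"])

# Search space for the FC head: number of layers, neurons per layer, dropout.
space = [Integer(1, 3, name="num_fc_layers"),
         Integer(64, 1024, name="units"),
         Real(0.0, 0.5, name="dropout_rate")]

# Gaussian-process Bayesian optimization over the FC-head structure; the best
# configuration found would then be fine-tuned in full on the target task.
result = gp_minimize(objective, space, n_calls=20, random_state=0)
print("Best FC-head configuration:", result.x)
```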