Neural Architecture Search (NAS) methods autonomously discover high-accuracy neural network architectures, outperforming manually crafted ones. However, NAS methods incur high computational costs due to the high-dimensional search space and the need to train multiple candidate solutions. This paper introduces LCoDeepNEAT, a Lamarckian genetic algorithm that extends the foundational principles of the CoDeepNEAT framework. LCoDeepNEAT co-evolves CNN architectures and their final-layer weights. Evaluating a candidate in LCoDeepNEAT entails a single epoch of stochastic gradient descent (SGD), after which the learned final-layer weights are transferred back into the network's genetic representation. In addition, it expedites evolution by restricting the architecture search space to networks whose classification head comprises exactly two fully connected layers. Our method yields a notable improvement in the classification accuracy of candidate solutions throughout the evolutionary process, ranging from 2% to 5.6%. This outcome underscores the effectiveness of integrating gradient information and evolving the last layer of candidate solutions within LCoDeepNEAT. LCoDeepNEAT is assessed across six standard image classification datasets and benchmarked against eight leading NAS methods. Results demonstrate LCoDeepNEAT's ability to swiftly discover competitive CNN architectures with fewer parameters, conserving computational resources while achieving superior classification accuracy compared to other approaches.
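The Lamarckian evaluation step described above can be illustrated with a minimal PyTorch-style sketch. The names `build_model`, `model.classifier`, and `genome.final_layer_weights` are assumptions introduced for illustration, not the paper's actual API: each candidate is decoded into a CNN, trained for exactly one SGD epoch, and its trained classifier weights are written back into its genotype so that offspring inherit them.

```python
import torch
import torch.nn as nn

def lamarckian_evaluate(genome, build_model, train_loader, device="cpu"):
    # Decode the genotype into a trainable CNN. `build_model` is a
    # hypothetical decoder; the paper's decoding follows CoDeepNEAT.
    model = build_model(genome).to(device)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
    criterion = nn.CrossEntropyLoss()

    # A single epoch of SGD, as described in the abstract.
    model.train()
    correct, total = 0, 0
    for inputs, targets in train_loader:
        inputs, targets = inputs.to(device), targets.to(device)
        optimizer.zero_grad()
        logits = model(inputs)
        loss = criterion(logits, targets)
        loss.backward()
        optimizer.step()
        correct += (logits.argmax(dim=1) == targets).sum().item()
        total += targets.size(0)

    # Lamarckian step: copy the learned final-layer weights back into the
    # genetic representation. `model.classifier` and
    # `genome.final_layer_weights` are assumed attribute names.
    genome.final_layer_weights = {
        name: p.detach().cpu().clone()
        for name, p in model.classifier.named_parameters()
    }

    # Fitness proxy: accuracy accumulated over the training epoch.
    return correct / max(total, 1)
```

A sketch of this kind makes the design choice concrete: because only the final-layer weights are stored in the genotype, the inheritance mechanism stays cheap while still passing gradient-derived information between generations.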