When training Convolutional Neural Networks (CNNs), there is a strong emphasis on designing efficient optimization algorithms and highly accurate networks. The state of the art in network optimization relies on gradient descent algorithms such as Stochastic Gradient Descent (SGD). However, gradient descent methods have limitations; the major drawback is a lack of exploration and an over-reliance on exploitation. Hence, this research analyzes an alternative approach to optimizing neural network (NN) weights using population-based metaheuristic algorithms. A hybrid of the Grey Wolf Optimizer (GWO) and Genetic Algorithms (GA) is explored in conjunction with SGD, producing a Genetically Modified Wolf optimization algorithm boosted with SGD (GMW-SGD). This algorithm combines exploitation and exploration while also tackling the high-dimensionality issue that degrades the performance of standard metaheuristic algorithms. The proposed algorithm was trained and tested on CIFAR-10, where it performs comparably to SGD, reaching high test accuracy, and significantly outperforms standard metaheuristic algorithms.
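The abstract does not specify the hybrid's exact update rules, hyperparameters, or the order in which GWO, GA, and SGD are combined. The sketch below is therefore only an illustration of how such a GWO/GA/SGD hybrid could be assembled: standard GWO position updates toward the three leading wolves, uniform crossover with the alpha wolf plus Gaussian mutation as the GA component, and a few minibatch SGD steps per wolf as the gradient boost. The toy least-squares objective, population size, mutation rate, and learning rate are all assumptions made for the example, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: minibatch least-squares regression standing in for CNN weight fitting.
X = rng.normal(size=(256, 20))
w_true = rng.normal(size=20)
y = X @ w_true + 0.1 * rng.normal(size=256)

def loss(w):
    r = X @ w - y
    return float(np.mean(r ** 2))

def grad(w, batch=32):
    # Stochastic gradient of the mean-squared error on a random minibatch.
    idx = rng.integers(0, X.shape[0], size=batch)
    Xb, yb = X[idx], y[idx]
    return 2.0 * Xb.T @ (Xb @ w - yb) / batch

dim, n_wolves, iters = 20, 12, 100   # illustrative sizes, not the paper's
wolves = rng.normal(size=(n_wolves, dim))

for t in range(iters):
    a = 2.0 * (1 - t / iters)        # GWO control parameter, decays 2 -> 0
    fitness = np.array([loss(w) for w in wolves])
    alpha, beta, delta = wolves[np.argsort(fitness)[:3]]

    # --- GWO step: move each wolf toward the three leaders (exploration) ---
    new = np.empty_like(wolves)
    for i, w in enumerate(wolves):
        cand = []
        for leader in (alpha, beta, delta):
            A = 2 * a * rng.random(dim) - a
            C = 2 * rng.random(dim)
            D = np.abs(C * leader - w)
            cand.append(leader - A * D)
        new[i] = np.mean(cand, axis=0)

    # --- GA step: uniform crossover with alpha, then Gaussian mutation ---
    for i in range(n_wolves):
        mask = rng.random(dim) < 0.5
        new[i] = np.where(mask, new[i], alpha)
        if rng.random() < 0.2:                   # mutation rate (assumed)
            new[i] += 0.05 * rng.normal(size=dim)

    # --- SGD boost: a few gradient steps refine each wolf (exploitation) ---
    for i in range(n_wolves):
        for _ in range(3):
            new[i] -= 0.01 * grad(new[i])

    wolves = new

best = wolves[np.argmin([loss(w) for w in wolves])]
print("final loss:", loss(best))
```

The key design point this sketch tries to convey is the division of labor the abstract describes: the population-based GWO/GA moves supply exploration across the weight space, while the per-wolf SGD steps supply local exploitation, which is what lets the hybrid cope with dimensionalities where pure metaheuristics stall.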