In this work, we present an extension of the genetic algorithm (GA) that exploits the supervised learning technique known as active subspaces (AS) to evolve individuals in a lower-dimensional space. In many cases, GA in fact requires more function evaluations than other optimization methods to converge to the global optimum. Complex, high-dimensional functions may therefore be extremely demanding, from a computational viewpoint, to optimize with the standard algorithm. To address this issue, we propose to linearly map the input parameter space of the original function onto its AS before the evolution, performing the mutation and mating processes in a lower-dimensional space. In this contribution, we describe the novel method, called ASGA, presenting differences and similarities with the standard GA. We test the proposed method on n-dimensional benchmark functions -- Rosenbrock, Ackley, Bohachevsky, Rastrigin, Schaffer N. 7, and Zakharov -- and finally apply it to an aeronautical shape optimization problem.
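The idea sketched in the abstract can be illustrated with a minimal example: estimate the active subspace from sampled gradients (eigenvectors of C = E[∇f ∇fᵀ]), then run a simple GA whose population lives in the reduced coordinates z, mapping each individual back to the full space only for fitness evaluation. This is not the authors' ASGA implementation; the test function, GA operators (truncation selection, blend crossover, Gaussian mutation), and all parameter values below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
a = np.array([1.0, 2.0, 0.5, -1.0])  # hypothetical test function direction

def f(x):
    # 4-D quadratic varying only along a: a 1-D active subspace (assumed example)
    return (x @ a - 3.0) ** 2

def grad_f(x):
    return 2.0 * (x @ a - 3.0) * a

# 1. Estimate the active subspace: dominant eigenvectors of C = E[grad f grad f^T]
samples = rng.uniform(-1, 1, size=(200, 4))
grads = np.array([grad_f(x) for x in samples])
C = grads.T @ grads / len(grads)
eigvals, eigvecs = np.linalg.eigh(C)      # ascending eigenvalues
W1 = eigvecs[:, -1:]                      # keep the dominant direction, shape (4, 1)

# 2. Evolve a population in the reduced space z = W1^T x
pop = rng.uniform(-1, 1, size=(30, W1.shape[1]))
for gen in range(50):
    fit = np.array([f(W1 @ z) for z in pop])   # back-map to full space to evaluate
    parents = pop[np.argsort(fit)[:10]]        # truncation selection of the 10 best
    i, j = rng.integers(0, 10, size=(2, 30))   # random parent pairs
    alpha = rng.uniform(size=(30, 1))
    pop = alpha * parents[i] + (1 - alpha) * parents[j]  # blend crossover
    pop += rng.normal(scale=0.05, size=pop.shape)        # Gaussian mutation

best_z = pop[np.argmin([f(W1 @ z) for z in pop])]
best_x = (W1 @ best_z).ravel()             # candidate optimum in the full space
print(f(best_x))
```

Because f varies only along a, the estimated one-dimensional subspace spans that direction and the reduced GA recovers a near-optimal point while searching a single coordinate instead of four.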