Genetic algorithms constitute a family of black-box optimization algorithms, which take inspiration from the principles of biological evolution. While they provide a general-purpose tool for optimization, their particular instantiations can be heuristic and motivated by loose biological intuition. In this work we explore a fundamentally different approach: Given a sufficiently flexible parametrization of the genetic operators, we discover entirely new genetic algorithms in a data-driven fashion. More specifically, we parametrize selection and mutation rate adaptation as cross- and self-attention modules and use Meta-Black-Box-Optimization to evolve their parameters on a set of diverse optimization tasks. The resulting Learned Genetic Algorithm outperforms state-of-the-art adaptive baseline genetic algorithms and generalizes far beyond its meta-training settings. The learned algorithm can be applied to previously unseen optimization problems, search dimensions, and evaluation budgets. We conduct an extensive analysis of the discovered operators and provide ablation experiments, which highlight the benefits of flexible module parametrization and the ability to transfer (`plug-in') the learned operators to conventional genetic algorithms.
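To make the parametrization concrete, the following is a minimal sketch (not the authors' implementation) of the idea described above: selection is expressed as a cross-attention module over fitness features of parents and offspring, and per-member mutation rates come from a self-attention module over the same features. The feature choice, layer sizes, toy sphere objective, and randomly initialized weights `theta` are all assumptions made for illustration; the outer Meta-Black-Box-Optimization loop that would actually evolve `theta` across tasks is not shown.

```python
import numpy as np

rng = np.random.default_rng(0)
N, D, d = 16, 8, 4          # population size, search dimension, attention width

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def features(f):
    """Simple per-member fitness features: z-score and normalized rank."""
    z = (f - f.mean()) / (f.std() + 1e-8)
    r = np.argsort(np.argsort(f)) / (len(f) - 1)
    return np.stack([z, r], axis=-1)                       # (N, 2)

def select(parents, offspring, f_par, f_off, theta):
    """Cross-attention selection: offspring queries attend over parent keys."""
    q = features(f_off) @ theta["Wq"]                      # (N, d)
    k = features(f_par) @ theta["Wk"]                      # (N, d)
    A = softmax(q @ k.T / np.sqrt(d), axis=0)              # (N, N), columns = parent slots
    # Each parent slot is challenged by the offspring with the largest
    # attention weight on it (a simplified, deterministic stand-in for
    # whatever rule the meta-learned weights would encode).
    winner = A.argmax(axis=0)
    better = (f_off[winner] < f_par)[:, None]
    new_pop = np.where(better, offspring[winner], parents)
    new_fit = np.minimum(f_off[winner], f_par)
    return new_pop, new_fit

def mutation_rates(f, theta):
    """Self-attention over population fitness features -> per-member step size."""
    h = features(f)
    q, k, v = h @ theta["Mq"], h @ theta["Mk"], h @ theta["Mv"]
    A = softmax(q @ k.T / np.sqrt(d), axis=-1)
    return np.exp(A @ v @ theta["Mo"]).squeeze(-1) * 0.1   # (N,) positive rates

def sphere(x):                                             # toy objective (minimize)
    return (x ** 2).sum(axis=-1)

# Placeholder module weights; in the paper's setting these would be the
# meta-evolved parameters of the learned genetic algorithm.
theta = {k: rng.normal(0, 0.3, s) for k, s in
         [("Wq", (2, d)), ("Wk", (2, d)),
          ("Mq", (2, d)), ("Mk", (2, d)), ("Mv", (2, d)), ("Mo", (d, 1))]}

parents = rng.normal(0, 1, (N, D))
f_par = sphere(parents)
for gen in range(50):
    sigma = mutation_rates(f_par, theta)                   # adapted per member
    offspring = parents + sigma[:, None] * rng.normal(0, 1, (N, D))
    f_off = sphere(offspring)
    parents, f_par = select(parents, offspring, f_par, f_off, theta)
print("best fitness after 50 generations:", f_par.min())
```

Because both operators are ordinary parametric modules, an outer black-box optimizer can treat the flattened entries of `theta` as its search space and score each candidate by the downstream optimization performance of the resulting genetic algorithm, which is the meta-training setup the abstract refers to.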