Most evolutionary algorithms have parameters, which allow great flexibility in controlling their behavior and in adapting them to new problems. To achieve the best performance, it is often necessary to control some of the parameters during optimization, which gave rise to various parameter control methods. In recent works, however, similar advantages have been shown, and even proven, for sampling parameter values from certain, often heavy-tailed, fixed distributions. This produced a family of algorithms currently known as "fast evolution strategies" and "fast genetic algorithms". However, little is known so far about the influence of these distributions on the performance of evolutionary algorithms, or about the relationships between (dynamic) parameter control and (static) parameter sampling. We contribute to the body of knowledge by presenting, for the first time, an algorithm that computes the optimal static distributions, which describe the mutation operator used in the well-known simple $(1+\lambda)$ evolutionary algorithm, on the classic benchmark problem OneMax. We show that, for large enough population sizes, such optimal distributions may be surprisingly complicated and counter-intuitive. We investigate certain properties of these distributions, and also evaluate the performance regrets of the $(1+\lambda)$ evolutionary algorithm when it uses commonly applied mutation distributions.
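To make the setting concrete, the following is a minimal sketch of a $(1+\lambda)$ evolutionary algorithm on OneMax whose mutation strength is sampled from a static heavy-tailed (power-law) distribution, in the spirit of the "fast genetic algorithm" family. The exponent `beta`, the population size `lam`, and the problem size `n` are illustrative choices, not values taken from the paper; the optimal static distributions the paper computes are in general different from this simple power law.

```python
import random


def one_max(x):
    """OneMax fitness: number of one-bits."""
    return sum(x)


def sample_power_law(rng, n, beta=1.5):
    """Sample a mutation strength alpha in {1, ..., n//2} with
    Pr[alpha = k] proportional to k^(-beta) (a heavy-tailed law)."""
    ks = range(1, n // 2 + 1)
    weights = [k ** -beta for k in ks]
    return rng.choices(list(ks), weights=weights)[0]


def one_plus_lambda_ea(n=20, lam=4, beta=1.5, seed=1):
    """(1+lambda) EA on OneMax with a static heavy-tailed mutation
    distribution. Returns (evaluations used, final bit string)."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    evals = 0
    while one_max(x) < n:
        # Generate lambda offspring; each samples its own strength alpha
        # and applies standard bit mutation with rate alpha/n.
        best = None
        for _ in range(lam):
            alpha = sample_power_law(rng, n, beta)
            p = alpha / n
            y = [b ^ (1 if rng.random() < p else 0) for b in x]
            evals += 1
            if best is None or one_max(y) > one_max(best):
                best = y
        # Elitist (plus) selection: keep the parent unless an offspring
        # is at least as good.
        if one_max(best) >= one_max(x):
            x = best
    return evals, x
```

The heavy tail occasionally produces large mutation strengths, which helps escape regions where small steps stall, at a modest cost when small steps suffice; this trade-off is exactly what the choice of static distribution controls.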