Most evolutionary algorithms have multiple parameters, and their values drastically affect the performance. Due to the often complicated interplay of the parameters, setting these values right for a particular problem (parameter tuning) is a challenging task. This task becomes even more complicated when the optimal parameter values change significantly during the run of the algorithm, since then a dynamic parameter choice (parameter control) is necessary. In this work, we propose a lazy but effective solution, namely choosing all parameter values (where this makes sense) in each iteration randomly from a suitably scaled power-law distribution. To demonstrate the effectiveness of this approach, we perform runtime analyses of the $(1+(\lambda,\lambda))$ genetic algorithm with all three parameters chosen in this manner. We show that, on the one hand, this algorithm can imitate simple hill-climbers like the $(1+1)$ EA, achieving the same asymptotic runtime on problems such as OneMax, LeadingOnes, and Minimum Spanning Tree. On the other hand, this algorithm is also very efficient on jump functions, where the best static parameters are very different from those necessary to optimize simple problems. We prove a performance guarantee comparable to, and sometimes even better than, the best performance known for static parameters. We complement our theoretical results with a rigorous empirical study confirming what the asymptotic runtime results suggest.
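To make the "lazy" strategy concrete, the following is a minimal Python sketch of per-iteration parameter sampling from a scaled power-law distribution. The helper name `power_law`, the exponent `beta = 2.5`, and the sampling ranges are illustrative assumptions for exposition, not the exact constants from the paper's runtime analysis.

```python
import random

def power_law(n_max, beta):
    """Sample an integer i from {1, ..., n_max} with Pr[X = i] proportional to i^(-beta)."""
    support = range(1, n_max + 1)
    weights = [i ** (-beta) for i in support]
    return random.choices(support, weights=weights)[0]

# Fresh parameter choices in each iteration, for a problem of size n.
# Ranges and the exponent below are assumptions chosen for illustration.
n = 100
lam = power_law(n // 2, beta=2.5)    # offspring population size lambda
p = power_law(n // 2, beta=2.5) / n  # mutation rate, scaled into (0, 1/2]
c = power_law(lam, beta=2.5) / lam   # crossover bias, scaled into (0, 1]
print(lam, p, c)
```

Because a power-law distribution places most of its mass on small values while still sampling large values with polynomially decaying probability, such a choice can mimic conservative hill-climber settings in most iterations yet occasionally tries the aggressive parameter values needed on problems like jump functions.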