Jump functions are the most studied non-unimodal benchmark in the theory of randomized search heuristics, in particular, evolutionary algorithms (EAs). They have significantly improved our understanding of how EAs escape from local optima. However, their particular structure -- to leave the local optimum one can only jump directly to the global optimum -- raises the question of how representative such results are. For this reason, we propose an extended class $\textsc{Jump}_{k,\delta}$ of jump functions that contain a valley of low fitness of width $\delta$ starting at distance $k$ from the global optimum. We prove that several previous results extend to this more general class: for all $k = o(n^{1/3})$ and $\delta < k$, the optimal mutation rate for the $(1+1)$~EA is $\frac{\delta}{n}$, and the fast $(1+1)$~EA runs faster than the classical $(1+1)$~EA by a factor super-exponential in $\delta$. However, we also observe that some known results do not generalize: the randomized local search algorithm with stagnation detection, which is faster than the fast $(1+1)$~EA by a factor polynomial in $k$ on $\textsc{Jump}_k$, is slower by a factor polynomial in $n$ on some $\textsc{Jump}_{k,\delta}$ instances. Computationally, the new class allows experiments with wider fitness valleys, especially when they lie further away from the global optimum.
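The abstract does not state the exact formula for $\textsc{Jump}_{k,\delta}$, but the description (a valley of low fitness of width $\delta$ starting at distance $k$ from the all-ones global optimum, reducing to classical $\textsc{Jump}_k$ when $\delta = k$) suggests a formalization along the following lines. This is a hedged sketch, not the paper's verbatim definition; the exact offsets and boundary conventions may differ.

```python
def jump_k_delta(x, k, delta):
    """Sketch of a Jump_{k,delta}-style fitness function on bit strings.

    Assumption (not taken verbatim from the paper): fitness increases with
    the number of ones, except in a valley of width delta that starts at
    distance k from the all-ones optimum, where fitness decreases.
    """
    n = len(x)
    ones = sum(x)  # number of one-bits, i.e. distance n - ones to the optimum
    if ones <= n - k or ones >= n - k + delta:
        # outside the valley: fitness guides the search toward more ones
        return n + ones
    # inside the valley of low fitness: fewer ones is (deceptively) better
    return n - ones
```

With $\delta = k$ the valley spans all of $\{n-k+1, \dots, n-1\}$, recovering the classical jump-function shape; with $\delta < k$ the valley is narrower and lies further from the optimum, which is what makes a mutation flipping exactly $\delta$ bits the cheapest way across.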