Estimation of distribution algorithms (EDAs) provide a distribution-based approach to optimization that adapts its probability distribution during the run of the algorithm. We contribute to the theoretical understanding of EDAs and point out that their distribution-based approach makes them more suitable for dealing with rugged fitness landscapes than classical local search algorithms. Concretely, we make the OneMax function rugged by adding noise to each fitness value. The compact genetic algorithm (cGA) can nevertheless find solutions with $n(1 - \epsilon)$ many 1s, even for high variance of the noise. In contrast, randomized local search (RLS) and the (1+1) EA, with high probability, only find solutions with $n(1/2 + o(1))$ many 1s, even for noise of small variance.
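To make the setting concrete, the following is a minimal sketch of a cGA on OneMax with additive Gaussian noise. It is an illustration under stated assumptions, not the algorithm variant analyzed in the paper: the noise model, the hypothetical population-size parameter \texttt{K}, the frequency margins $[1/n, 1-1/n]$, and the parameter choices in the usage example are all assumptions for demonstration.

\begin{verbatim}
import random

def noisy_onemax(x, sigma):
    """OneMax (number of 1s) with additive Gaussian noise of std dev sigma (assumed noise model)."""
    return sum(x) + random.gauss(0.0, sigma)

def cga(n, K, sigma, max_iters=200_000):
    """Minimal cGA sketch on noisy OneMax.

    p[i] is the probability of sampling a 1 at position i; frequencies are
    nudged by 1/K toward the bit values of the apparent winner and kept
    inside the margins [1/n, 1 - 1/n].
    """
    p = [0.5] * n
    for _ in range(max_iters):
        # sample two offspring from the current frequency vector
        x = [1 if random.random() < pi else 0 for pi in p]
        y = [1 if random.random() < pi else 0 for pi in p]
        # compare noisy fitness values; noise may mislead the comparison
        if noisy_onemax(x, sigma) < noisy_onemax(y, sigma):
            x, y = y, x  # make x the apparent winner
        for i in range(n):
            if x[i] != y[i]:
                step = 1.0 / K if x[i] == 1 else -1.0 / K
                p[i] = min(1 - 1 / n, max(1 / n, p[i] + step))
    return p

if __name__ == "__main__":
    n = 100
    # hypothetical parameter choice for illustration only
    p = cga(n=n, K=25 * int(n ** 0.5), sigma=n)
    best_guess = [1 if pi >= 0.5 else 0 for pi in p]
    print(sum(best_guess), "ones in the rounded frequency vector")
\end{verbatim}

The sketch only illustrates the mechanism discussed above: because the cGA aggregates many noisy comparisons into small frequency updates, individual misleading evaluations tend to average out, whereas a local search that accepts or rejects single mutations based on one noisy comparison is derailed much more easily.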