Bayesian optimization is a popular formalism for global optimization, but its computational costs limit it to expensive-to-evaluate functions. A competing, computationally more efficient, global optimization framework is optimistic optimization, which exploits prior knowledge about the geometry of the search space in the form of a dissimilarity function. We investigate to what degree the conceptual advantages of Bayesian optimization can be combined with the computational efficiency of optimistic optimization. By mapping the kernel to a dissimilarity, we obtain an optimistic optimization algorithm for the Bayesian optimization setting with a run time of up to $\mathcal{O}(N \log N)$. As a high-level take-away we find that, when using stationary kernels on objectives of relatively low evaluation cost, optimistic optimization can be strongly preferable to Bayesian optimization, whereas for strongly coupled and parametric models, good implementations of Bayesian optimization can perform much better, even at low evaluation cost. We argue that there is a new research domain between geometric and probabilistic search, i.e., methods that run drastically faster than traditional Bayesian optimization while retaining some of its crucial functionality.
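As a concrete illustration of the kernel-to-dissimilarity map (a standard construction stated here for orientation; it need not match the paper's exact choice), any positive-definite kernel $k$ induces the canonical semimetric
$$ d(x, x') \;=\; \sqrt{k(x, x) - 2\,k(x, x') + k(x', x')}, $$
which is the distance between $x$ and $x'$ in the kernel's feature space. For a stationary kernel, $k(x, x) = k(x', x')$ is a constant, so $d$ depends only on $k(x, x')$ and can serve directly as the dissimilarity that optimistic optimization requires.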