Bayesian optimisation (BO) is a well-known sample-efficient algorithm for finding the global optimum of expensive black-box functions. Current practical BO algorithms have regret bounds ranging from $\mathcal{O}(\frac{\log N}{\sqrt{N}})$ to $\mathcal{O}(e^{-\sqrt{N}})$, where $N$ is the number of evaluations. This paper explores the possibility of improving the regret bound in the noiseless setting by intertwining concepts from BO and tree-based optimistic optimisation, which is based on partitioning the search space. We propose BOO, a first practical algorithm that achieves an exponential regret bound of order $\mathcal{O}(N^{-\sqrt{N}})$ under the assumption that the objective function is sampled from a Gaussian process with a Mat\'ern kernel with smoothness parameter $\nu > 4 + \frac{D}{2}$, where $D$ is the number of dimensions. We perform experiments on the optimisation of various synthetic functions and on machine learning hyperparameter tuning tasks, and show that our algorithm outperforms baselines.
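To give a feel for the gap between these rates, the following sketch evaluates the three bounds mentioned in the abstract at a few budgets $N$. This is purely illustrative: constants and problem-dependent factors hidden by the $\mathcal{O}(\cdot)$ notation are ignored, and the function names are our own labels, not part of the paper.

```python
import math

def rate_slow(n):
    # O(log N / sqrt(N)): the slower end of existing practical BO bounds
    return math.log(n) / math.sqrt(n)

def rate_exp(n):
    # O(e^{-sqrt(N)}): the faster end of existing practical BO bounds
    return math.exp(-math.sqrt(n))

def rate_boo(n):
    # O(N^{-sqrt(N)}): the exponential bound claimed for BOO
    return n ** (-math.sqrt(n))

# Compare the decay of the three rates at increasing evaluation budgets.
for n in (25, 100, 400):
    print(f"N={n:4d}  logN/sqrtN={rate_slow(n):.2e}  "
          f"e^-sqrtN={rate_exp(n):.2e}  N^-sqrtN={rate_boo(n):.2e}")
```

Even at modest budgets the claimed rate dominates: at $N = 100$, $N^{-\sqrt{N}} = 10^{-20}$ while $e^{-\sqrt{N}} \approx 4.5 \times 10^{-5}$.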