Bayesian optimization is a sequential procedure for finding the global optimum of a black-box function whose true form is not known a priori. Good uncertainty estimates over the shape of the objective function are essential for guiding the optimization process. However, these estimates can be inaccurate if the true objective function violates assumptions made by its model (e.g., Gaussianity). This paper studies which uncertainties are needed in Bayesian optimization models and argues that ideal uncertainties should be calibrated: an 80% predictive interval should contain the true outcome 80% of the time. We propose a simple algorithm for enforcing this property and show that it enables Bayesian optimization to reach the global optimum in fewer steps. We provide theoretical insights into the role of calibrated uncertainties and demonstrate the improved performance of our method on standard benchmark functions and hyperparameter optimization tasks.
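To make the calibration property concrete, the sketch below checks whether the central predictive intervals of a Gaussian model achieve their nominal coverage, and builds a simple quantile-remapping function of the kind used in standard recalibration schemes. This is an illustrative example with synthetic data, not the paper's algorithm; the model here is deliberately overconfident (reported std 0.5 vs. true noise std 1.0), so its 80% intervals cover far fewer than 80% of outcomes.

```python
import numpy as np
from math import erf, sqrt

def std_normal_cdf(z):
    """CDF of the standard normal distribution."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# Synthetic setup (illustrative only): the model reports Gaussian predictions
# with std 0.5, but the true observation noise has std 1.0 -- overconfident.
rng = np.random.default_rng(0)
mu = rng.normal(size=500)
sigma = np.full(500, 0.5)
y = mu + rng.normal(scale=1.0, size=500)

# Probability integral transform: where each outcome lands in its predictive CDF.
pit = np.array([std_normal_cdf((yi - mi) / si)
                for yi, mi, si in zip(y, mu, sigma)])

# Empirical coverage of central predictive intervals at several nominal levels.
coverage = {}
for p in (0.5, 0.8, 0.95):
    lo, hi = (1 - p) / 2, 1 - (1 - p) / 2
    coverage[p] = float(np.mean((pit >= lo) & (pit <= hi)))
    print(f"nominal {p:.2f} -> empirical {coverage[p]:.2f}")

# A monotone recalibration map R(p) = empirical fraction of PIT values <= p.
# Composing R with the model's predictive CDF yields probabilities that are
# calibrated on this data, in the sense described in the abstract.
levels = np.sort(pit)
def recalibrated_level(p):
    return np.searchsorted(levels, p, side="right") / len(levels)
```

For a well-calibrated model the printed empirical coverages would match the nominal levels; here they fall well short, which is exactly the failure mode that motivates enforcing calibration inside the optimization loop.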