Bayesian optimization is a procedure for finding the global optimum of black-box functions and is useful in applications such as hyperparameter optimization. Uncertainty estimates over the shape of the objective function are instrumental in guiding the optimization process. However, these estimates can be inaccurate if the objective function violates assumptions made within the underlying model (e.g., Gaussianity). We propose a simple algorithm to calibrate the uncertainty of posterior distributions over the objective function as part of the Bayesian optimization process. We show that by improving the uncertainty estimates of the posterior distribution through calibration, Bayesian optimization makes better decisions and reaches the global optimum in fewer steps. This technique improves the performance of Bayesian optimization on standard benchmark functions and hyperparameter optimization tasks.
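To make the idea concrete, the sketch below shows one simple way a calibration step could be folded into a Bayesian optimization loop; it is not the paper's exact algorithm. Here the surrogate's predictive standard deviation is rescaled so that leave-one-out standardized residuals have unit variance, and the rescaled uncertainty is then used in the acquisition function (expected improvement). All names (`calibration_scale`, `expected_improvement`, the toy 1-D objective) are illustrative choices, and scikit-learn's Gaussian process regressor stands in for whatever surrogate the method actually uses.

```python
# Minimal sketch: Bayesian optimization with a variance-rescaling calibration step.
# Assumptions: sklearn GP surrogate, EI acquisition, toy 1-D objective (not from the paper).
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern
from sklearn.model_selection import LeaveOneOut


def calibration_scale(X, y, kernel):
    """Scale s such that s * sigma makes leave-one-out residuals roughly unit-variance."""
    z = []
    for train_idx, test_idx in LeaveOneOut().split(X):
        gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
        gp.fit(X[train_idx], y[train_idx])
        mu, sigma = gp.predict(X[test_idx], return_std=True)
        z.append((y[test_idx] - mu) / np.maximum(sigma, 1e-9))
    return float(np.sqrt(np.mean(np.concatenate(z) ** 2)))


def expected_improvement(mu, sigma, best_y):
    """Expected improvement for minimization, using the (recalibrated) predictive std."""
    sigma = np.maximum(sigma, 1e-9)
    gamma = (best_y - mu) / sigma
    return sigma * (gamma * norm.cdf(gamma) + norm.pdf(gamma))


def objective(x):
    # Toy black-box function standing in for a benchmark or hyperparameter task.
    return np.sin(3 * x) + 0.3 * x ** 2


rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(5, 1))
y = objective(X).ravel()
candidates = np.linspace(-2, 2, 400).reshape(-1, 1)

for _ in range(15):
    kernel = Matern(nu=2.5)
    s = calibration_scale(X, y, kernel)                 # calibration step
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)
    mu, sigma = gp.predict(candidates, return_std=True)
    ei = expected_improvement(mu, s * sigma, y.min())   # acquisition uses recalibrated sigma
    x_next = candidates[np.argmax(ei)].reshape(1, -1)
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next).ravel())

print("best x:", X[np.argmin(y)].item(), "best f:", y.min())
```

A single scale factor is only the crudest form of calibration; quantile-based recalibration of the full predictive distribution follows the same pattern, replacing the scalar `s` with a monotone map fitted on held-out predictive quantiles.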