Bayesian optimization is a broadly applied methodology for optimizing expensive black-box functions. Despite its success, it still struggles in high-dimensional search spaces. To alleviate this problem, we propose a novel Bayesian optimization framework (termed SILBO), which iteratively finds a low-dimensional space in which to perform Bayesian optimization through semi-supervised dimension reduction. SILBO incorporates both labeled points and unlabeled points acquired from the acquisition function to guide the learning of the embedding space. To accelerate the learning procedure, we present a randomized method for generating the projection matrix. Furthermore, to map from the low-dimensional space back to the high-dimensional original space, we propose two mapping strategies, $\text{SILBO}_{FZ}$ and $\text{SILBO}_{FX}$, chosen according to the evaluation overhead of the objective function. Experimental results on both synthetic functions and hyperparameter optimization tasks demonstrate that SILBO outperforms existing state-of-the-art high-dimensional Bayesian optimization methods.
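To make the setting concrete, the sketch below illustrates the generic idea that SILBO builds on: running Bayesian optimization in a low-dimensional embedding and mapping candidate points back to the high-dimensional space through a projection matrix. This is only a minimal illustration with a fixed random projection and a simple expected-improvement search; it does not reproduce SILBO's semi-supervised learning of the embedding or its $\text{SILBO}_{FZ}$/$\text{SILBO}_{FX}$ mapping strategies, and all names (`objective`, `D`, `d`, `n_iter`) are illustrative assumptions.

```python
# Generic sketch: Bayesian optimization in a low-dimensional linear embedding.
# The projection matrix B is random and fixed here; SILBO instead learns the
# embedding iteratively via semi-supervised dimension reduction.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern
from scipy.stats import norm

D, d, n_iter = 100, 5, 30            # ambient dim, embedding dim, BO iterations
rng = np.random.default_rng(0)

def objective(x):                    # hypothetical expensive black-box function
    return np.sum((x[:10] - 0.5) ** 2)   # only the first 10 dimensions matter

B = rng.normal(size=(D, d))          # fixed random projection matrix

def to_high_dim(z):                  # map a low-dim point z back into [0, 1]^D
    return np.clip(B @ z, 0.0, 1.0)

Z = rng.uniform(-1, 1, size=(5, d))  # initial design in the embedding space
y = np.array([objective(to_high_dim(z)) for z in Z])

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
for _ in range(n_iter):
    gp.fit(Z, y)
    cand = rng.uniform(-1, 1, size=(2000, d))     # random candidate set
    mu, sigma = gp.predict(cand, return_std=True)
    gamma = (y.min() - mu) / np.maximum(sigma, 1e-9)
    ei = sigma * (gamma * norm.cdf(gamma) + norm.pdf(gamma))  # expected improvement
    z_next = cand[np.argmax(ei)]                  # acquisition maximizer in low dim
    y_next = objective(to_high_dim(z_next))       # evaluate in the original space
    Z, y = np.vstack([Z, z_next]), np.append(y, y_next)

print("best value found:", y.min())
```

Because the surrogate model and the acquisition function operate only on the $d$-dimensional embedding, the cost of each iteration is independent of the ambient dimension $D$; the quality of the result then hinges on how well the embedding captures the effective subspace, which is exactly what SILBO's semi-supervised embedding learning targets.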