Bayesian Optimization is a popular tool for tuning algorithms in automatic machine learning (AutoML) systems. Current state-of-the-art methods leverage Random Forests or Gaussian processes to build a surrogate model that predicts algorithm performance given a certain set of hyperparameter settings. In this paper, we propose a new surrogate model based on gradient boosting, where we use quantile regression to provide optimistic estimates of the performance of an unobserved hyperparameter setting, and combine this with a distance metric between unobserved and observed hyperparameter settings to help regulate exploration. We demonstrate empirically that the new method is able to outperform some state-of-the-art techniques across a reasonably sized set of classification problems.
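The surrogate described above can be sketched with a quantile gradient-boosting model: fitting an upper quantile yields optimistic performance estimates, and a nearest-neighbor distance bonus encourages exploration of settings far from those already observed. This is a minimal illustrative sketch, not the paper's actual formulation; the quantile level (`alpha=0.9`), the exploration weight (`lam`), the 1-D toy objective, and the use of nearest-neighbor Euclidean distance are all assumptions made here for demonstration.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Toy setup: observed 1-D hyperparameter settings and their (noisy) scores.
# In a real AutoML loop these would come from actual training runs.
X_obs = rng.uniform(0.0, 1.0, size=(20, 1))
y_obs = np.sin(4.0 * X_obs[:, 0]) + rng.normal(0.0, 0.1, size=20)

# Gradient-boosting surrogate fitted to an upper quantile (alpha=0.9 is an
# illustrative choice): its predictions act as optimistic estimates of the
# performance of unobserved settings.
surrogate = GradientBoostingRegressor(
    loss="quantile", alpha=0.9, n_estimators=100, random_state=0
)
surrogate.fit(X_obs, y_obs)

def acquisition(X_cand, lam=0.5):
    """Optimistic quantile prediction plus a distance-based exploration bonus.

    `lam` (exploration weight) and the nearest-neighbor distance are
    assumptions of this sketch, standing in for the paper's distance metric.
    """
    optimistic = surrogate.predict(X_cand)
    # Distance from each candidate to its nearest observed setting.
    dists = np.min(np.abs(X_cand[:, None, 0] - X_obs[None, :, 0]), axis=1)
    return optimistic + lam * dists

# Pick the next hyperparameter setting to evaluate from a candidate grid.
X_cand = np.linspace(0.0, 1.0, 101).reshape(-1, 1)
x_next = X_cand[np.argmax(acquisition(X_cand))]
```

In a full optimization loop, `x_next` would be evaluated, appended to `(X_obs, y_obs)`, and the surrogate refit before choosing the next candidate.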