In this work we rigorously analyse assumptions inherent to black-box optimisation for hyper-parameter tuning tasks. Our results on the Bayesmark benchmark indicate that heteroscedasticity and non-stationarity pose significant challenges for black-box optimisers. Based on these findings, we propose the Heteroscedastic and Evolutionary Bayesian Optimisation solver (HEBO). HEBO performs non-linear input and output warping, admits exact marginal log-likelihood optimisation, and is robust to the values of learned parameters. We demonstrate HEBO's empirical efficacy on the NeurIPS 2020 Black-Box Optimisation challenge, where HEBO placed first. Upon further analysis, we observe that HEBO significantly outperforms existing black-box optimisers on the 108 machine-learning hyper-parameter tuning tasks that comprise the Bayesmark benchmark. Our findings indicate that the majority of hyper-parameter tuning tasks exhibit heteroscedasticity and non-stationarity, that multi-objective acquisition ensembles with Pareto-front solutions improve queried configurations, and that robust acquisition maximisers afford empirical advantages relative to their non-robust counterparts. We hope these findings serve as guiding principles for practitioners of Bayesian optimisation. All code is made available at https://github.com/huawei-noah/HEBO.
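As a brief illustration of the kind of non-linear input and output warping referred to above, the sketch below applies a Kumaraswamy CDF warp to inputs and a Yeo-Johnson power transform to observed objective values, two common ways of mitigating non-stationarity and heteroscedasticity before fitting a Gaussian-process surrogate. This is a minimal sketch under assumed choices, not HEBO's actual implementation: the specific transforms, parameter values, and toy objective are illustrative assumptions.

```python
# Illustrative sketch (assumptions, not HEBO's exact implementation):
# warp inputs with a Kumaraswamy CDF to handle non-stationarity and
# warp outputs with a Yeo-Johnson power transform to stabilise a
# heteroscedastic objective before fitting a GP surrogate.
import numpy as np
from sklearn.preprocessing import PowerTransformer

def kumaraswamy_warp(x, a=1.5, b=0.8):
    """Monotone warp of inputs in [0, 1]; a and b are hypothetical values."""
    return 1.0 - (1.0 - np.clip(x, 0.0, 1.0) ** a) ** b

rng = np.random.default_rng(0)
X = rng.uniform(size=(50, 3))  # toy hyper-parameter configurations in [0, 1]^3
# Toy objective whose noise level grows with the first input (heteroscedastic).
y = np.exp(3.0 * X[:, 0]) + rng.normal(scale=1.0 + 5.0 * X[:, 0], size=50)

X_warped = kumaraswamy_warp(X)  # stretch/compress the input space
y_warped = PowerTransformer(method="yeo-johnson").fit_transform(y.reshape(-1, 1))

# X_warped and y_warped would then be handed to a GP surrogate whose
# marginal log-likelihood is optimised exactly, as described above.
```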