This paper presents the results and insights from the black-box optimization (BBO) challenge at NeurIPS 2020, which ran from July to October 2020. The challenge emphasized the importance of evaluating derivative-free optimizers for tuning the hyperparameters of machine learning models. It was the first black-box optimization challenge with a machine learning emphasis, based on tuning the (validation set) performance of standard machine learning models on real datasets. The competition has widespread relevance because black-box optimization (e.g., Bayesian optimization) applies to hyperparameter tuning in almost every machine learning project, as well as many applications outside of machine learning. The final leaderboard was determined by optimization performance on held-out (hidden) objective functions, on which the optimizers ran without human intervention. Baselines were set using the default settings of several open-source black-box optimization packages as well as random search.