Hyperparameter tuning has become increasingly important as machine learning (ML) models are extensively applied in data mining applications. Among various approaches, Bayesian optimization (BO) is a successful methodology for tuning hyperparameters automatically. While traditional methods optimize each tuning task in isolation, there has been recent interest in speeding up BO by transferring knowledge across previous tasks. In this work, we introduce an automatic method to design the BO search space with the aid of tuning history from past tasks. This simple yet effective approach can be used to endow many existing BO methods with transfer learning capabilities. In addition, it enjoys three advantages: universality, generality, and safeness. Extensive experiments show that our approach considerably boosts BO by designing a promising and compact search space instead of using the entire space, and outperforms state-of-the-art methods on a wide range of benchmarks, including machine learning and deep learning tuning tasks as well as neural architecture search.
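To make the idea concrete, the following is a minimal sketch (not the paper's actual algorithm) of deriving a compact search space from the top-performing configurations of past tasks; the function `compact_search_space`, the `top_k` and `margin` parameters, and the toy history are all illustrative assumptions.

```python
import numpy as np

def compact_search_space(history, top_k=10, margin=0.1):
    """Derive a compact box-shaped search space from past tuning history.

    history: list of tasks; each task is a list of (config, score) pairs,
             where config maps hyperparameter name -> value and higher
             score is better. All names here are illustrative.
    """
    top_configs = []
    for task in history:
        # keep the best-performing configurations of each previous task
        best = sorted(task, key=lambda cs: cs[1], reverse=True)[:top_k]
        top_configs.extend(cfg for cfg, _ in best)

    space = {}
    for name in top_configs[0].keys():
        values = np.array([cfg[name] for cfg in top_configs], dtype=float)
        lo, hi = values.min(), values.max()
        pad = margin * (hi - lo)  # small safety margin around the box
        space[name] = (lo - pad, hi + pad)
    return space

# Toy example: two past tasks tuning learning rate and weight decay
history = [
    [({"lr": 0.10, "wd": 1e-4}, 0.90), ({"lr": 0.01, "wd": 1e-5}, 0.93)],
    [({"lr": 0.05, "wd": 5e-5}, 0.91), ({"lr": 0.20, "wd": 1e-3}, 0.85)],
]
print(compact_search_space(history, top_k=1))
```

Any off-the-shelf BO method can then be run inside the returned box instead of the full space, which is what gives the approach its plug-and-play transfer-learning flavor.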