Bayesian optimization is a powerful paradigm for optimizing black-box functions based on scarce and noisy data. Its data efficiency can be further improved by transfer learning from related tasks. While recent transfer models meta-learn a prior based on large amounts of data, in the low-data regime methods that exploit the closed-form posterior of Gaussian processes (GPs) have an advantage. In this setting, several analytically tractable transfer-model posteriors have been proposed, but the relative advantages of these methods are not well understood. In this paper, we provide a unified view on hierarchical GP models for transfer learning, which allows us to analyze the relationships between methods. As part of the analysis, we develop a novel closed-form boosted GP transfer model that fits between existing approaches in terms of complexity. We evaluate the performance of the different approaches in large-scale experiments and highlight strengths and weaknesses of the different transfer-learning methods.
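As a point of reference for the closed-form GP posterior the abstract appeals to, the following is a minimal, illustrative sketch of exact GP posterior inference with a squared-exponential kernel. All function names, the fixed lengthscale, and the noise level are illustrative assumptions, not part of the models proposed in the paper.

```python
import numpy as np

def rbf_kernel(a, b, lengthscale=1.0):
    # Squared-exponential (RBF) kernel matrix between 1-D input arrays.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_posterior(x_train, y_train, x_test, noise_var=1e-2):
    """Closed-form GP posterior mean and variance at x_test
    (illustrative sketch; zero mean prior, fixed hyperparameters)."""
    K = rbf_kernel(x_train, x_train) + noise_var * np.eye(len(x_train))
    K_s = rbf_kernel(x_train, x_test)
    K_ss = rbf_kernel(x_test, x_test)
    # Cholesky factorization for a numerically stable solve.
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha
    v = np.linalg.solve(L, K_s)
    cov = K_ss - v.T @ v
    return mean, np.diag(cov)

# Toy usage: three noisy observations of sin(x), queried at x = 0.5.
x = np.array([-1.0, 0.0, 1.0])
y = np.sin(x)
mu, var = gp_posterior(x, y, np.array([0.5]))
```

Because the posterior is available in closed form, transfer-learning variants can combine such posteriors across tasks analytically rather than resorting to approximate inference, which is the property the low-data methods discussed in the paper exploit.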