Bayesian optimization (BO) is a powerful approach for optimizing black-box, expensive-to-evaluate functions. To enable a flexible trade-off between cost and accuracy, many applications allow the function to be evaluated at different fidelities. In order to reduce the optimization cost while maximizing the benefit-cost ratio, in this paper we propose Batch Multi-fidelity Bayesian Optimization with Deep Auto-Regressive Networks (BMBO-DARN). We use a set of Bayesian neural networks to construct a fully auto-regressive model, which is expressive enough to capture strong yet complex relationships across all the fidelities and thereby improve surrogate learning and optimization performance. Furthermore, to enhance the quality and diversity of queries, we develop a simple yet efficient batch querying method that avoids any combinatorial search over the fidelities. We propose a batch acquisition function based on the Max-value Entropy Search (MES) principle, which penalizes highly correlated queries and encourages diversity. We use posterior samples and moment matching to enable efficient computation of the acquisition function, and we conduct alternating optimization over each fidelity-input pair, which guarantees an improvement at every step. We demonstrate the advantage of our approach on four real-world hyperparameter optimization applications.
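As a rough illustration of the fully auto-regressive surrogate described above, the following sketch (not the authors' implementation) chains one network per fidelity, where the network at fidelity m conditions on the input x and on the predicted outputs of all lower fidelities. Plain PyTorch MLPs with MC dropout stand in for the Bayesian neural networks used in the paper; the layer sizes, fidelity count, and sampling scheme are illustrative assumptions.

```python
import torch
import torch.nn as nn

class FidelityNet(nn.Module):
    """Surrogate for one fidelity; input = x concatenated with lower-fidelity outputs."""
    def __init__(self, input_dim, num_lower_fidelities, hidden_dim=64, p_drop=0.1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(input_dim + num_lower_fidelities, hidden_dim),
            nn.ReLU(),
            nn.Dropout(p_drop),   # MC dropout as a crude stand-in for a Bayesian NN
            nn.Linear(hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Dropout(p_drop),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, x, lower_outputs):
        # lower_outputs: list of (batch, 1) predictions from fidelities below this one
        z = torch.cat([x] + lower_outputs, dim=-1) if lower_outputs else x
        return self.net(z)

class AutoRegressiveSurrogate(nn.Module):
    """Chain of per-fidelity networks; fidelity m sees the outputs of all fidelities < m."""
    def __init__(self, input_dim, num_fidelities, hidden_dim=64):
        super().__init__()
        self.nets = nn.ModuleList(
            FidelityNet(input_dim, m, hidden_dim) for m in range(num_fidelities)
        )

    def forward(self, x):
        outputs = []
        for net in self.nets:
            outputs.append(net(x, outputs))   # auto-regressive conditioning
        return outputs                        # one prediction per fidelity

# Toy usage: posterior samples via repeated stochastic forward passes (dropout kept on),
# summarized by moment matching into a per-fidelity Gaussian mean and std.
model = AutoRegressiveSurrogate(input_dim=5, num_fidelities=3)
model.train()                                 # keep dropout active for MC sampling
x = torch.randn(8, 5)
samples = torch.stack([torch.cat(model(x), dim=-1) for _ in range(32)])  # (32, 8, 3)
mean, std = samples.mean(0), samples.std(0)
```

In this sketch the auto-regressive conditioning is what lets higher fidelities exploit strong, possibly nonlinear relationships with cheaper fidelities; the MC samples play the role of the posterior samples that the acquisition computation would consume.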