Bayesian optimization (BO) is known as a powerful tool for optimizing an unknown, expensive function by querying its values sequentially. In many practical problems, however, additional unknown constraints must also be considered. In this paper, we propose an information-theoretic approach called Constrained Max-value Entropy Search via Information lower BOund (CMES-IBO) for constrained BO (CBO). Although information-theoretic methods have been studied in the CBO literature, they have not revealed any relation between their acquisition functions and the original mutual information. In contrast, our acquisition function is an unbiased, consistent estimator of a lower bound of the mutual information. We show that CMES-IBO has several advantageous properties, such as non-negativity, estimation error bounds for the acquisition function, and well-definedness of the criterion, none of which have been shown for existing information-theoretic CBO methods. Furthermore, by using conditional mutual information, we extend CMES-IBO to the parallel setting, in which multiple queries can be issued simultaneously. We demonstrate the effectiveness of CMES-IBO on several benchmark functions.