Bayesian optimization (BO) is a popular paradigm for global optimization of expensive black-box functions, but there are many domains where the function is not completely black-box. The data may have some known structure, e.g. symmetries, and the data generation process can yield useful intermediate or auxiliary information in addition to the value of the optimization objective. However, surrogate models traditionally employed in BO, such as Gaussian Processes (GPs), scale poorly with dataset size and struggle to incorporate known structure or auxiliary information. Instead, we propose performing BO on complex, structured problems by using Bayesian Neural Networks (BNNs), a class of scalable surrogate models that have the representation power and flexibility to handle structured data and exploit auxiliary information. We demonstrate BO on a number of realistic problems in physics and chemistry, including topology optimization of photonic crystal materials using convolutional neural networks, and chemical property optimization of molecules using graph neural networks. On these complex tasks, we show that BNNs often outperform GPs as surrogate models for BO in terms of both sampling efficiency and computational cost.
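To make the approach concrete, below is a minimal, illustrative sketch of a BO loop that uses a neural-network surrogate with MC dropout as one common approximation to Bayesian inference, together with an expected-improvement acquisition function. This is not the authors' implementation: the toy 1-D objective, network architecture, hyperparameters, and function names are all assumptions chosen only to show how a BNN-style surrogate slots into the BO loop in place of a GP.

```python
# Minimal sketch (illustrative assumptions, not the paper's code): Bayesian
# optimization with a neural-network surrogate whose predictive uncertainty
# comes from MC dropout, one standard approximation to a Bayesian neural net.
import numpy as np
import torch
import torch.nn as nn

def objective(x):
    """Toy stand-in for an expensive black-box objective (assumption)."""
    return np.sin(3 * x) + 0.5 * x

class DropoutSurrogate(nn.Module):
    """Small MLP; dropout is kept active at prediction time so repeated
    forward passes approximate samples from a posterior over outputs."""
    def __init__(self, p=0.1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(1, 64), nn.ReLU(), nn.Dropout(p),
            nn.Linear(64, 64), nn.ReLU(), nn.Dropout(p),
            nn.Linear(64, 1),
        )

    def forward(self, x):
        return self.net(x)

def fit(model, X, y, epochs=500, lr=1e-2):
    """Fit the surrogate to the observations collected so far."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    model.train()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(X), y)
        loss.backward()
        opt.step()

def predict(model, X, n_samples=64):
    """Predictive mean and std over stochastic forward passes (dropout on)."""
    model.train()  # leave dropout active on purpose
    with torch.no_grad():
        samples = torch.stack([model(X) for _ in range(n_samples)])
    return samples.mean(0).squeeze(-1), samples.std(0).squeeze(-1)

def expected_improvement(mu, sigma, best, xi=0.01):
    """Standard EI acquisition for maximization."""
    sigma = sigma.clamp(min=1e-9)
    z = (mu - best - xi) / sigma
    normal = torch.distributions.Normal(0.0, 1.0)
    return (mu - best - xi) * normal.cdf(z) + sigma * normal.log_prob(z).exp()

# BO loop over a dense candidate grid (a stand-in for a structured search space
# such as photonic-crystal topologies or molecular graphs).
rng = np.random.default_rng(0)
X_obs = rng.uniform(-2, 2, size=(5, 1))
y_obs = objective(X_obs)

candidates = torch.linspace(-2, 2, 400).unsqueeze(-1)
for step in range(20):
    Xt = torch.tensor(X_obs, dtype=torch.float32)
    yt = torch.tensor(y_obs, dtype=torch.float32)
    surrogate = DropoutSurrogate()
    fit(surrogate, Xt, yt)                      # retrain surrogate on all data
    mu, sigma = predict(surrogate, candidates)  # posterior mean / uncertainty
    ei = expected_improvement(mu, sigma, best=yt.max())
    x_next = candidates[ei.argmax()].numpy().reshape(1, 1)
    X_obs = np.vstack([X_obs, x_next])
    y_obs = np.vstack([y_obs, objective(x_next)])

print("best observed value:", y_obs.max())
```

For the structured problems described above, the `DropoutSurrogate` MLP would be replaced by a convolutional or graph neural network matched to the input representation; the surrounding loop is unchanged.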