We propose a practical Bayesian optimization method over sets, to minimize a black-box function that takes a set as a single input. Because set inputs are permutation-invariant, traditional Gaussian process-based Bayesian optimization strategies, which assume vector inputs, can fall short. To address this, we develop a Bayesian optimization method with a \emph{set kernel} that is used to build surrogate functions. This kernel accumulates similarity over set elements to enforce permutation invariance, but at a greater computational cost. To reduce this burden, we propose two key components: (i) a more efficient, approximate set kernel that remains positive-definite and is an unbiased estimator of the true set kernel, with variance upper-bounded in terms of the number of subsampled elements, and (ii) a constrained acquisition function optimization over sets, which exploits the symmetry of the feasible region that defines a set input. Finally, we present several numerical experiments demonstrating that our method outperforms other methods.
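For intuition, the sketch below illustrates how a set kernel of this kind can accumulate base-kernel similarity over all pairs of set elements (making it permutation-invariant), and how subsampling elements can reduce the pairwise cost. The function names, the RBF base kernel, and the subsampling scheme are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def rbf(x, y, gamma=1.0):
    # Base kernel between two element vectors (squared-exponential); assumed for illustration.
    return np.exp(-gamma * np.sum((x - y) ** 2))

def set_kernel(X, Y, base_kernel=rbf):
    # Set kernel: average base-kernel similarity over all element pairs.
    # Permutation-invariant because the double sum ignores element order,
    # but it costs O(|X| * |Y|) base-kernel evaluations.
    total = sum(base_kernel(x, y) for x in X for y in Y)
    return total / (len(X) * len(Y))

def approx_set_kernel(X, Y, num_subsamples, base_kernel=rbf, seed=None):
    # Illustrative approximation: subsample elements from each set before
    # accumulating similarity, trading accuracy for a smaller pairwise cost.
    rng = np.random.default_rng(seed)
    idx_x = rng.choice(len(X), size=min(num_subsamples, len(X)), replace=False)
    idx_y = rng.choice(len(Y), size=min(num_subsamples, len(Y)), replace=False)
    return set_kernel([X[i] for i in idx_x], [Y[j] for j in idx_y], base_kernel)
```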