Black-box optimization (BBO) algorithms aim to find the best solutions to problems whose analytical details are unavailable. Most classical methods for such problems rest on strong, fixed \emph{a priori} assumptions, such as a Gaussian distribution. However, complex real-world problems, especially when the global optimum is sought, can deviate substantially from these \emph{a priori} assumptions because of their diversity, posing unexpected obstacles to such methods. In this paper, we present an optimizer based on generative adversarial nets (OPT-GAN) that adapts to diverse black-box problems by estimating the distribution of optima. The method learns a broad distribution over the optimal region, guided by selectively retained and randomly moving candidates, thereby balancing exploration and exploitation. Experiments demonstrate that, on BBOB problems and several other benchmarks with atypical distributions, OPT-GAN outperforms other classical BBO algorithms, in particular those relying on Gaussian assumptions.
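To make the idea concrete, the sketch below shows, under stated assumptions, how a GAN can act as a black-box optimizer in the spirit described above: a generator proposes candidate solutions, and a discriminator is trained to separate them from slightly perturbed elite candidates found so far, so the generator gradually concentrates on the estimated optimal region. This is a minimal illustration assuming PyTorch and a toy sphere objective; the network sizes, hyper-parameters, and update scheme are illustrative and are not the authors' OPT-GAN implementation.

```python
# Minimal sketch of a GAN-based black-box optimizer (illustrative only,
# not the authors' OPT-GAN). Assumes PyTorch and a toy sphere objective.
import torch
import torch.nn as nn

DIM, NOISE_DIM, POP, ELITE = 10, 16, 64, 16

def objective(x):
    # assumed black-box function (sphere); lower is better
    return (x ** 2).sum(dim=1)

gen = nn.Sequential(nn.Linear(NOISE_DIM, 64), nn.ReLU(), nn.Linear(64, DIM))
disc = nn.Sequential(nn.Linear(DIM, 64), nn.ReLU(), nn.Linear(64, 1), nn.Sigmoid())
opt_g = torch.optim.Adam(gen.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(disc.parameters(), lr=1e-3)
bce = nn.BCELoss()

archive = torch.randn(POP, DIM) * 5.0  # random initial candidates

for it in range(200):
    # keep the ELITE best candidates and jitter them (exploration)
    elites = archive[objective(archive).argsort()[:ELITE]]
    real = elites + 0.1 * torch.randn_like(elites)

    # discriminator step: elite region = "real", generated = "fake"
    fake = gen(torch.randn(ELITE, NOISE_DIM)).detach()
    d_loss = bce(disc(real), torch.ones(ELITE, 1)) + \
             bce(disc(fake), torch.zeros(ELITE, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # generator step: push samples toward the estimated optimal region
    samples = gen(torch.randn(POP, NOISE_DIM))
    g_loss = bce(disc(samples), torch.ones(POP, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

    # evaluate new candidates on the black box and keep the best POP
    archive = torch.cat([archive, samples.detach()], dim=0)
    archive = archive[objective(archive).argsort()[:POP]]

print("best value found:", objective(archive).min().item())
```

The design choice to mix selection (keeping elites) with random perturbation mirrors the exploration-exploitation balance mentioned in the abstract: selection exploits the best known regions, while the added noise and the generator's stochastic sampling keep exploring beyond them.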