The key to black-box optimization is to search efficiently through input regions with potentially widely varying numerical properties, achieving low-regret descent and fast progress toward the optimum. Monte Carlo Tree Search (MCTS) methods have recently been introduced to improve Bayesian optimization by computing better partitionings of the search space that balance exploration and exploitation. Extending this promising framework, we study how to further integrate sample-based descent for faster optimization. We design novel ways of expanding Monte Carlo search trees, with new descent methods at vertices that incorporate stochastic search and Gaussian processes. We propose the corresponding rules for balancing progress and uncertainty, branch selection, tree expansion, and backpropagation. The designed search process puts more emphasis on sampling for faster descent and uses localized Gaussian processes as auxiliary metrics for both exploitation and exploration. We show empirically that the proposed algorithms can outperform state-of-the-art methods on many challenging benchmark problems.