To solve a black-box optimization problem, sequential model-based optimization iteratively selects a candidate point by constructing a surrogate model from the history of evaluations. Gaussian process (GP) regression is a popular choice of surrogate model because of its capability to compute prediction uncertainty analytically. An ensemble of randomized trees is another option, with practical merits over GPs owing to its scalability and its ease of handling mixed continuous/discrete variables. In this paper we revisit various ensembles of randomized trees to investigate their behavior from the perspective of prediction uncertainty estimation. We then propose a new way of constructing an ensemble of randomized trees, referred to as the BwO forest, in which bagging with oversampling is employed to construct bootstrapped samples that are used to build randomized trees with random splitting. Experimental results demonstrate the validity and strong performance of the BwO forest over existing tree-based models in various settings.
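The following is a minimal sketch, not the authors' reference implementation, of the idea described above: each tree is fit on a bootstrap sample drawn with oversampling (assumed here to mean a resample larger than the original data set) and uses random split points, while the mean and spread of the per-tree predictions serve as the surrogate prediction and its uncertainty estimate. The class name BwOForest, the oversampling ratio, and the use of scikit-learn's DecisionTreeRegressor with splitter="random" are illustrative assumptions.

    # Sketch of a "BwO forest" surrogate: bagging with oversampling + random splits.
    # Assumptions (not taken from the paper): oversample_ratio=2.0, scikit-learn trees.
    import numpy as np
    from sklearn.tree import DecisionTreeRegressor


    class BwOForest:
        def __init__(self, n_trees=100, oversample_ratio=2.0, random_state=0):
            self.n_trees = n_trees
            self.oversample_ratio = oversample_ratio  # bootstrap size = ratio * n
            self.rng = np.random.default_rng(random_state)
            self.trees = []

        def fit(self, X, y):
            n = X.shape[0]
            m = int(self.oversample_ratio * n)  # oversampled bootstrap size
            self.trees = []
            for _ in range(self.n_trees):
                idx = self.rng.integers(0, n, size=m)  # bagging with oversampling
                tree = DecisionTreeRegressor(
                    splitter="random",  # random splitting at each node
                    random_state=int(self.rng.integers(1 << 31)),
                )
                tree.fit(X[idx], y[idx])
                self.trees.append(tree)
            return self

        def predict(self, X):
            # Ensemble mean as the surrogate prediction, std as the uncertainty.
            preds = np.stack([t.predict(X) for t in self.trees])  # (n_trees, n_points)
            return preds.mean(axis=0), preds.std(axis=0)


    # Usage: surrogate for a noisy 1-D black-box function.
    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        X = rng.uniform(-3, 3, size=(40, 1))
        y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=40)
        mu, sigma = BwOForest().fit(X, y).predict(np.linspace(-3, 3, 5).reshape(-1, 1))
        print(mu, sigma)

In a sequential model-based optimization loop, the returned mean and uncertainty would feed an acquisition function (e.g., expected improvement) to select the next candidate point, analogous to how a GP surrogate is used.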