Computing a Gaussian process (GP) posterior has a computational cost cubic in the number of historical points. A reformulation of the same GP posterior highlights that this complexity mainly depends on how many \emph{unique} historical points are considered. This can have important implications in active learning settings, where the set of historical points is constructed sequentially by the learner. We show that sequential black-box optimization based on GPs (GP-Opt) can be made efficient by sticking to a candidate solution for multiple evaluation steps and switching only when necessary. Limiting the number of switches also limits the number of unique points in the history of the GP. Thus, the efficient GP reformulation can be used to exactly and cheaply compute the posteriors required to run GP-Opt algorithms. This approach is especially useful in real-world applications of GP-Opt with high switching costs (e.g., switching chemicals in wet labs, data/model loading in hyperparameter optimization). As examples of this meta-approach, we modify two well-established GP-Opt algorithms, GP-UCB and GP-EI, to switch candidates as infrequently as possible by adapting rules from batched GP-Opt. These versions preserve all the theoretical no-regret guarantees while improving practical aspects of the algorithms such as runtime, memory complexity, and the ability to batch candidates and evaluate them in parallel.
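To make the reformulation concrete, here is one standard way to write it, as a sketch in notation we introduce here (assuming i.i.d.\ Gaussian observation noise $\varepsilon \sim \mathcal{N}(0,\sigma^2)$; the symbols $q$, $m_i$, $\bar{y}$, $M$ are ours, not fixed by the text above). If the history contains $q$ unique points $x_1,\dots,x_q$, with $x_i$ evaluated $m_i$ times and average observation $\bar{y}_i$, the repeated evaluations can be collapsed into the sufficient statistics $\bar{y}_i \sim \mathcal{N}(f(x_i), \sigma^2/m_i)$, giving the posterior
\[
\mu(x) = k(x)^\top \bigl(K + \sigma^2 M^{-1}\bigr)^{-1} \bar{y},
\qquad
s^2(x) = k(x,x) - k(x)^\top \bigl(K + \sigma^2 M^{-1}\bigr)^{-1} k(x),
\]
where $K \in \mathbb{R}^{q \times q}$ is the kernel matrix over the unique points only, $k(x) \in \mathbb{R}^{q}$ collects the kernel evaluations between $x$ and those points, and $M = \mathrm{diag}(m_1,\dots,m_q)$. The matrix inverse then costs $O(q^3)$ in the number of \emph{unique} points rather than cubic in the total number of evaluations, which is why limiting switches also limits the cost of the posterior updates.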