Most existing black-box optimization methods assume that all variables in the system being optimized have equal cost and can change freely at each iteration. However, in many real-world systems, inputs are passed through a sequence of different operations or modules, making variables in earlier stages of processing more costly to update. Such structure imposes a cost on switching variables in the early parts of a data processing pipeline. In this work, we propose a new algorithm for switch-cost-aware optimization called Lazy Modular Bayesian Optimization (LaMBO). This method efficiently identifies the global optimum while minimizing cost by changing variables in early modules only infrequently. The method is theoretically grounded and achieves vanishing regret even when the regret is augmented with switching costs. We apply LaMBO to multiple synthetic functions and to a three-stage image segmentation pipeline used in a neuroscience application, where we obtain promising improvements over prevailing cost-aware Bayesian optimization algorithms. Our results demonstrate that LaMBO is an effective strategy for black-box optimization that is capable of minimizing switching costs in modular systems.