In this work, we study the problem of global optimization of univariate loss functions, where we analyze the regret of popular lower bounding algorithms (e.g., the Piyavskii-Shubert algorithm). For any given time $T$, instead of the commonly studied simple regret (the difference between the loss of the best estimate up to time $T$ and the loss of the global optimizer), we study the cumulative regret up to that time. With a suitable lower bounding algorithm, we show that it is possible to achieve satisfactory cumulative regret bounds for different classes of functions. For Lipschitz continuous functions with parameter $L$, we show that the cumulative regret is $O(L\log T)$. For Lipschitz smooth functions with parameter $H$, we show that the cumulative regret is $O(H)$. We also analytically extend our results to a broader class of functions that covers both the Lipschitz continuous and Lipschitz smooth functions individually.
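To make the lower bounding scheme concrete, the following is a minimal Python sketch of the classical Piyavskii-Shubert step for minimizing an $L$-Lipschitz loss on $[a,b]$: each queried point $x_i$ induces the lower bound $f(x_i) - L|x - x_i|$, and the next query is placed at the minimizer of the pointwise maximum of these bounds. The function name `piyavskii_shubert` and the fixed query budget $T$ are illustrative assumptions here, not the exact variant analyzed in the paper.

```python
import math

def piyavskii_shubert(f, a, b, L, T):
    """Minimize an L-Lipschitz function f on [a, b] using T >= 2 queries
    with the classical Piyavskii-Shubert lower bounding scheme."""
    # Queried points, kept sorted by location.
    xs = [a, b]
    ys = [f(a), f(b)]
    for _ in range(T - 2):
        best_val, best_x, best_i = math.inf, None, None
        # On each sub-interval [x_i, x_{i+1}], the lower envelope
        # max_j (y_j - L|x - x_j|) attains its minimum where the two
        # cones rooted at the endpoints intersect.
        for i in range(len(xs) - 1):
            x_lo, x_hi = xs[i], xs[i + 1]
            y_lo, y_hi = ys[i], ys[i + 1]
            # Intersection point and its lower-bound value; Lipschitzness
            # of f guarantees x_new lies inside [x_lo, x_hi].
            x_new = 0.5 * (x_lo + x_hi) + (y_lo - y_hi) / (2.0 * L)
            lb = 0.5 * (y_lo + y_hi) - 0.5 * L * (x_hi - x_lo)
            if lb < best_val:
                best_val, best_x, best_i = lb, x_new, i
        # Query the point with the smallest lower bound and keep xs sorted.
        xs.insert(best_i + 1, best_x)
        ys.insert(best_i + 1, f(best_x))
    # Return the best estimate found so far.
    j = min(range(len(ys)), key=ys.__getitem__)
    return xs[j], ys[j]

# Example usage on a 1-Lipschitz loss:
# x_hat, f_hat = piyavskii_shubert(lambda x: abs(x - 0.3), 0.0, 1.0, L=1.0, T=20)
```

Under this query rule, the simple regret at time $T$ would be $f_hat$ minus the global minimum, while the cumulative regret studied in the paper sums the excess loss over all $T$ queries.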