We review various characterizations of uniform convexity and smoothness of norm balls in finite-dimensional spaces and connect results stemming from the geometry of Banach spaces with \textit{scaling inequalities} used in the convergence analysis of optimization methods. In particular, we establish local versions of these conditions to provide sharper insights into a recent body of complexity results in learning theory, online learning, and offline optimization that rely on the strong convexity of the feasible set. While they have a significant impact on complexity, these strong convexity or uniform convexity properties of feasible sets are not exploited as thoroughly as their functional counterparts, and this work is an effort to correct this imbalance. We conclude with practical examples in optimization and machine learning where leveraging these conditions and localized assumptions leads to new complexity results.
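For concreteness, the set strong convexity assumption referenced above can be stated as follows; this is a sketch of one standard formulation (as used, e.g., in Frank-Wolfe-type analyses), where the constant $\alpha$ and the norm $\|\cdot\|$ are assumed notation rather than this paper's own. A convex set $\mathcal{C} \subset \mathbb{R}^n$ is $\alpha$-strongly convex with respect to $\|\cdot\|$ if for any $x, y \in \mathcal{C}$, any $\gamma \in [0,1]$, and any $z \in \mathbb{R}^n$ with $\|z\| \le 1$,
\[
\gamma x + (1-\gamma)\, y + \gamma (1-\gamma)\, \frac{\alpha}{2}\, \|x - y\|^2 \, z \;\in\; \mathcal{C},
\]
i.e., $\mathcal{C}$ contains a ball of radius $\gamma(1-\gamma)\frac{\alpha}{2}\|x-y\|^2$ centered at each convex combination of $x$ and $y$. Uniform convexity of order $q \geq 2$ relaxes this condition by replacing $\|x-y\|^2$ with $\|x-y\|^q$, recovering strong convexity at $q = 2$.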