We revisit first-order optimization under local information constraints such as local privacy, gradient quantization, and computational constraints that limit access to only a few coordinates of the gradient. In this setting, the optimization algorithm is not allowed to directly access the complete output of the gradient oracle, but only receives limited information about it, subject to the local information constraints. We study the role of adaptivity in processing the gradient output to obtain this limited information from it. We consider optimization for both convex and strongly convex functions and obtain tight or nearly tight lower bounds on the convergence rate when adaptive gradient processing is allowed. Prior work was restricted to convex functions and allowed only nonadaptive processing of gradients. For both of these function classes and for the three information constraints mentioned above, our lower bounds imply that adaptive processing of gradients cannot outperform nonadaptive processing in most regimes of interest. We complement these results by exhibiting a natural optimization problem under information constraints for which adaptive processing of gradients strictly outperforms nonadaptive processing.
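To make the setting concrete, below is a minimal Python sketch, not from the paper, of information-constrained gradient access. The names `sign_quantize`, `coord_query`, and `constrained_sgd`, as well as the particular adaptive query rule, are illustrative assumptions rather than the paper's constructions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sign_quantize(g):
    # Quantization constraint (illustrative): one bit per coordinate.
    # For g in [-1, 1]^d, coordinate i is sent as +1 with probability
    # (g[i] + 1) / 2 and as -1 otherwise, so E[message] = g (unbiased).
    p = (np.clip(g, -1.0, 1.0) + 1.0) / 2.0
    return np.where(rng.random(g.shape) < p, 1.0, -1.0)

def coord_query(g, i):
    # Computational constraint: only coordinate i of the gradient is
    # revealed; rescaling by d keeps the estimate unbiased.
    d = g.shape[0]
    est = np.zeros(d)
    est[i] = d * g[i]
    return est

def constrained_sgd(grad, x0, steps=300, lr=0.3, adaptive=False):
    x = x0.copy()
    d = x0.shape[0]
    last = np.zeros(d)  # last revealed gradient entry per coordinate
    for t in range(steps):
        g = grad(x)
        if adaptive and t >= d:
            # Adaptive processing: the map applied at step t (here, which
            # coordinate to reveal) depends on the transcript of earlier
            # messages, via the last revealed entry of each coordinate.
            i = int(np.argmax(np.abs(last)))
        else:
            # Nonadaptive processing: a query schedule fixed in advance
            # (round-robin); also seeds the first d adaptive rounds.
            i = t % d
        msg = coord_query(g, i)
        last[i] = msg[i] / d  # record the revealed entry g[i]
        x -= (lr / np.sqrt(t + 1)) * msg
    return x

# Example: minimize the strongly convex f(x) = ||x - b||^2 / 2,
# whose gradient is x - b, under the coordinate constraint.
b = np.array([0.6, -0.2, 0.4])
print(constrained_sgd(lambda x: x - b, np.zeros(3), adaptive=True))
```

The distinction the sketch illustrates is that a nonadaptive scheme fixes its sequence of gradient-processing maps in advance, while an adaptive scheme may choose each map based on the messages received so far; `sign_quantize` shows how the quantization constraint could replace the coordinate channel in the same loop.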