We study iterated vector fields, i.e., vector fields composed with themselves, and investigate whether they are conservative, in the sense that they are the gradient of some scalar-valued function. We analyze the conservatism of various iterated vector fields, including the gradient vector fields of loss functions of generalized linear models. We relate this study to optimization and derive novel convergence results for federated learning algorithms. In particular, we show that for certain classes of functions (including non-convex functions), federated averaging is equivalent to gradient descent on a surrogate loss function. Finally, we discuss a variety of open questions spanning topics in geometry, dynamical systems, and optimization.
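To fix ideas, here is a minimal formalization of the two central notions; the symbols $F$, $f$, $k$, $\gamma$, $m$, $G_i$, and $\tilde{f}_i$ are introduced here for illustration and do not appear in the abstract. A vector field $F \colon \mathbb{R}^n \to \mathbb{R}^n$ is conservative if $F = \nabla f$ for some scalar-valued function $f$, and its $k$-th iterate is the self-composition
$$F^{(k)} = \underbrace{F \circ \cdots \circ F}_{k \text{ times}},$$
so the question studied is whether $F^{(k)}$ remains conservative whenever $F$ is.

As a sketch of the optimization connection (under the simplifying assumption that each of $m$ clients runs exactly $k$ steps of gradient descent with step size $\gamma$ on its local loss $f_i$), one round of federated averaging applies
$$x_{t+1} = \frac{1}{m} \sum_{i=1}^{m} G_i(x_t), \qquad G_i = (\mathrm{id} - \gamma \nabla f_i)^{(k)}.$$
If each residual field $x \mapsto x - G_i(x)$ is conservative, say equal to $\gamma \nabla \tilde{f}_i(x)$ for some surrogate loss $\tilde{f}_i$, then the round reduces to a single gradient descent step on $\tilde{f} = \frac{1}{m} \sum_{i=1}^{m} \tilde{f}_i$:
$$x_{t+1} = x_t - \gamma \nabla \tilde{f}(x_t).$$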