Algorithms increasingly operate within complex physical, social, and engineering systems, where they are exposed to disturbances, noise, and interconnections with other dynamical systems. This article extends known convergence guarantees for an algorithm operating in isolation (i.e., without disturbances) and systematically derives stability bounds and convergence rates in the presence of such disturbances. By leveraging converse Lyapunov theorems, we derive key inequalities that quantify the impact of disturbances. We further demonstrate how our result can be used to assess the effects of disturbances on algorithmic performance in a wide variety of applications, including communication constraints in distributed learning, sensitivity in machine learning generalization, and intentional noise injection for privacy. These applications underscore the role of our result as a unifying tool for analyzing algorithms subject to noise, disturbances, and interconnections with other dynamical systems.