Fixed-point iterations are at the heart of numerical computing and are often a computational bottleneck in real-time applications, which typically need only a fast solution of moderate accuracy. Classical acceleration methods for fixed-point problems focus on designing algorithms with theoretical guarantees that apply to any fixed-point problem. We present neural fixed-point acceleration, a framework that automatically learns to accelerate convex fixed-point problems drawn from a distribution, using ideas from meta-learning and classical acceleration algorithms. We apply our framework to SCS, the state-of-the-art solver for convex cone programming, and design models and loss functions to overcome the challenges of learning over unrolled optimization and acceleration instabilities. Our work brings neural acceleration to any optimization problem expressible with CVXPY. The source code behind this paper is available at https://github.com/facebookresearch/neural-scs.
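As a concrete illustration of the setting (not taken from the paper), a fixed-point problem asks for x* with f(x*) = x*, and the baseline method the abstract refers to is plain iteration x_{k+1} = f(x_k). A minimal Python sketch:

```python
import math

def fixed_point(f, x0, tol=1e-10, max_iter=1000):
    """Plain fixed-point iteration x_{k+1} = f(x_k), stopping when
    successive iterates are within tol of each other."""
    x = x0
    for _ in range(max_iter):
        x_next = f(x)
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    return x

# Classic example: the unique fixed point of cos on the reals.
x_star = fixed_point(math.cos, 1.0)
```

Acceleration methods (classical ones such as Anderson acceleration, or the learned ones proposed here) aim to reach a given tolerance in far fewer evaluations of f than this plain scheme.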