A merger of two optimization frameworks is introduced: SEquential Subspace OPtimization (SESOP) and MultiGrid (MG) optimization. At each iteration of the algorithm, the search direction implied by the coarse-grid correction process of MG is added to the low-dimensional search space of SESOP, which also includes the preconditioned gradient and directions derived from the previous iterates, termed {\em history}. Numerical experiments demonstrate the effectiveness of this approach. We then study the asymptotic convergence factor of the two-level version of SESOP-MG (dubbed SESOP-TG) for optimization of quadratic functions, and derive approximately optimal fixed parameters, which may significantly reduce the computational overhead for such problems.
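The mechanism described above can be sketched for a quadratic objective $f(x) = \frac{1}{2}x^\top A x - b^\top x$: each iteration minimizes $f$ exactly over a low-dimensional subspace spanned by the gradient, the history direction, and the coarse-grid correction direction. The following is a minimal illustrative sketch under assumed operators (a 1-D Laplacian, linear-interpolation prolongation `P`, full-weighting restriction `R`); all names are demo assumptions, not the paper's implementation.

```python
import numpy as np

# Illustrative sketch of one SESOP-TG iteration for a quadratic
# f(x) = 0.5 x^T A x - b^T x. The operators P (prolongation) and
# R (restriction), and all names here, are assumptions for the demo,
# not the paper's implementation.

def sesop_tg_step(A, b, x, x_prev, P, R):
    """Minimize f exactly over span{-gradient, history, coarse correction}."""
    g = A @ x - b                             # gradient of the quadratic
    d_hist = x - x_prev                       # 'history': the previous step
    Ac = R @ A @ P                            # Galerkin coarse-grid operator
    d_cg = P @ np.linalg.solve(Ac, R @ (-g))  # coarse-grid correction direction
    D = np.column_stack([-g, d_hist, d_cg])   # low-dimensional search space
    # The subspace problem min_a f(x + D a) is a tiny quadratic; lstsq
    # copes with a rank-deficient D (e.g. zero history on the first step).
    a = np.linalg.lstsq(D.T @ A @ D, -D.T @ g, rcond=None)[0]
    return x + D @ a

# Demo: 1-D Laplacian with linear interpolation / full weighting.
n = 7
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
P = np.array([[0.5, 0, 0], [1, 0, 0], [0.5, 0.5, 0], [0, 1, 0],
              [0, 0.5, 0.5], [0, 0, 1], [0, 0, 0.5]])
R = 0.5 * P.T
f = lambda x: 0.5 * x @ A @ x - b @ x

x_prev = x = np.zeros(n)
objective_values = [f(x)]
for _ in range(5):
    x, x_prev = sesop_tg_step(A, b, x, x_prev, P, R), x
    objective_values.append(f(x))
```

Because the subspace contains the negative gradient and the subspace problem is solved exactly, the objective is nonincreasing at every step; the coarse-grid direction accelerates the reduction of smooth error components that the gradient alone handles slowly.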