We propose to use the {\L}ojasiewicz inequality as a general tool for analyzing the convergence rate of gradient descent on a Hilbert manifold, without resorting to the continuous gradient flow. Using this tool, we show that a Sobolev gradient descent method with an adaptive inner product converges exponentially fast to the ground state of the Gross-Pitaevskii eigenproblem. This method can be extended to a class of general high-degree optimization problems or nonlinear eigenproblems under certain conditions. We demonstrate this generalization with several examples, in particular a nonlinear Schr\"odinger eigenproblem with an extra high-order interaction term. Numerical experiments are presented for these problems.
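For reference, one standard formulation of the Gross-Pitaevskii eigenproblem is sketched below; the precise potential $V$ and interaction coefficient $\beta$, as well as the normalization of the constants, may differ from those used in the paper.
\[
\min_{\|u\|_{L^2(\Omega)}=1} E(u), \qquad
E(u) = \int_\Omega \tfrac{1}{2}\,|\nabla u|^2 + \tfrac{1}{2}\,V u^2 + \tfrac{\beta}{4}\,u^4 \, dx ,
\]
whose Euler--Lagrange equation is the nonlinear eigenvalue problem
\[
-\Delta u + V u + \beta u^3 = \lambda u , \qquad \|u\|_{L^2(\Omega)} = 1 ,
\]
with the ground state being the constrained minimizer of $E$.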