We study the global convergence of the gradient descent method for the minimization of strictly convex functionals on an open and bounded subset of a Hilbert space. Unlike the case of the entire Hilbert space, such convergence results were previously unknown for sets of this type. The proof of this convergence is based on the classical contraction principle. We then use our result to establish a general framework for the numerical solution of boundary value problems for quasi-linear partial differential equations (PDEs) with noisy Cauchy data. The procedure employs Carleman weight functions to convexify a cost functional arising from the given boundary value problem, thereby ensuring the convergence of the gradient descent method above. We prove that the method converges globally as the noise tends to 0, with a Lipschitz convergence rate. Next, we apply this method to solve a highly nonlinear and severely ill-posed coefficient inverse problem, the so-called backscattering inverse problem, which has many real-world applications. Numerical examples are presented.
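The core mechanism described above, viewing gradient descent as a fixed-point iteration of a contraction map, can be illustrated with a minimal finite-dimensional sketch. This is a hypothetical toy example (a strictly convex quadratic on R^2, not the paper's Hilbert-space or PDE setting): the map T(x) = x - eta * grad J(x) is a contraction for a suitable step size, so the iterates converge geometrically to the unique minimizer.

```python
import numpy as np

# Toy illustration (not the paper's PDE setting): gradient descent
# x_{k+1} = x_k - eta * grad J(x_k) as a fixed-point iteration of the
# contraction T(x) = x - eta * grad J(x), for the strictly convex
# quadratic J(x) = 0.5 * x^T A x - b^T x with A symmetric positive definite.

def gradient_descent(A, b, x0, eta, n_iter):
    """Minimize J(x) = 0.5 x^T A x - b^T x; return the list of iterates."""
    x = x0.copy()
    history = [x.copy()]
    for _ in range(n_iter):
        x = x - eta * (A @ x - b)   # grad J(x) = A x - b
        history.append(x.copy())
    return history

A = np.array([[3.0, 1.0], [1.0, 2.0]])   # SPD Hessian
b = np.array([1.0, 1.0])
x_star = np.linalg.solve(A, b)           # unique minimizer of J
eigs = np.linalg.eigvalsh(A)
eta = 2.0 / (eigs.min() + eigs.max())    # step size making T a contraction
hist = gradient_descent(A, b, np.zeros(2), eta, 50)
errors = [np.linalg.norm(x - x_star) for x in hist]
# Errors decay geometrically: ||x_{k+1} - x*|| <= q * ||x_k - x*||, q < 1.
```

Here the contraction factor is q = (L - mu)/(L + mu), where mu and L are the extreme eigenvalues of A; in the paper's setting the analogous contraction property on the open bounded set is what yields the global convergence.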