We introduce a novel spectral, finite-dimensional approximation of general Sobolev spaces in terms of Chebyshev polynomials. Based on this polynomial surrogate model (PSM), we realise a variational formulation that solves a vast class of linear and non-linear partial differential equations (PDEs). PSMs are as flexible as physics-informed neural networks (PINNs) and provide an alternative for addressing inverse PDE problems, such as PDE-parameter inference. In contrast to PINNs, PSMs yield a convex optimisation problem for a vast class of PDEs, including all linear ones, in which case the PSM approximation is efficiently computable due to the exponential convergence rate of the underlying variational gradient descent. As a practical consequence, prominent PDE problems were solved by PSMs on a local machine, without High Performance Computing (HPC). This gain in efficiency is complemented by an increase in approximation power, outperforming PINN alternatives in both accuracy and runtime. Beyond the empirical evidence we give here, the translation of classic PDE theory into the language of these Sobolev-space approximates suggests that PSMs are universally applicable to well-posed, regular forward and inverse PDE problems.
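The exponential (spectral) convergence claimed above is the hallmark of Chebyshev-based approximation. The following minimal sketch is not the PSM variational scheme itself, but a standard Chebyshev collocation solve of a linear model problem, u'' = f on [-1, 1] with homogeneous Dirichlet boundary conditions; it illustrates the spectral accuracy that the PSM construction builds on. The differentiation-matrix formula follows Trefethen's well-known construction; all names here (`cheb`, `n`, etc.) are illustrative choices, not notation from the paper.

```python
import numpy as np

def cheb(n):
    """Chebyshev differentiation matrix D and Chebyshev points x
    on [-1, 1] (Trefethen-style construction)."""
    if n == 0:
        return np.zeros((1, 1)), np.array([1.0])
    x = np.cos(np.pi * np.arange(n + 1) / n)          # Chebyshev extreme points
    c = np.hstack([2.0, np.ones(n - 1), 2.0]) * (-1.0) ** np.arange(n + 1)
    X = np.tile(x, (n + 1, 1)).T
    dX = X - X.T
    D = np.outer(c, 1.0 / c) / (dX + np.eye(n + 1))   # off-diagonal entries
    D -= np.diag(D.sum(axis=1))                       # diagonal via row-sum identity
    return D, x

n = 24
D, x = cheb(n)
D2 = D @ D                                            # second-derivative operator

# Model problem: u'' = f with u(-1) = u(1) = 0.
# Exact solution u(x) = sin(pi x), hence f(x) = -pi^2 sin(pi x).
f = -np.pi**2 * np.sin(np.pi * x)

# Impose the Dirichlet conditions by restricting to interior nodes.
A = D2[1:-1, 1:-1]
u = np.zeros(n + 1)
u[1:-1] = np.linalg.solve(A, f[1:-1])

err = np.max(np.abs(u - np.sin(np.pi * x)))
print(err)  # spectral accuracy: error near machine precision already at n = 24
```

Doubling `n` in a finite-difference scheme roughly quarters the error; here the error instead decays exponentially in `n` for smooth solutions, which is the efficiency argument made for PSMs above.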