Randomized neural networks (RaNNs), in which the hidden layers remain fixed after random initialization, provide an efficient alternative to fully parameterized networks, since training reduces to solving a linear least-squares problem for the output weights. In this paper, RaNNs are integrated with overlapping Schwarz domain decomposition in two main ways: first, to formulate the least-squares problem with localized basis functions, and second, to construct overlapping preconditioners for the resulting linear systems. In particular, the neural networks in each subdomain are initialized randomly from a uniform distribution and linked through a partition of unity, forming a global approximation to the solution of the partial differential equation. Boundary conditions are enforced through a constraining operator, eliminating the need for a penalty term. Principal component analysis (PCA) is employed to reduce the number of basis functions in each subdomain, yielding a linear system with a lower condition number. By constructing additive Schwarz (AS) and restricted additive Schwarz (RAS) preconditioners, the least-squares problem is solved efficiently using the conjugate gradient (CG) and generalized minimal residual (GMRES) methods, respectively. Our numerical results demonstrate that the proposed approach significantly reduces computational time for multi-scale and time-dependent problems. Additionally, a three-dimensional problem demonstrates the efficiency of the CG method with an AS preconditioner, compared to a QR decomposition, for solving the least-squares problem.
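As a rough illustration of the two ingredients named above, fixed random hidden layers and PCA-based reduction of the basis, the following is a minimal single-domain sketch rather than the paper's method; the target function, the uniform sampling range, and the singular-value tolerance are assumptions chosen for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Collocation points and a smooth target standing in for a PDE solution
x = np.linspace(0.0, 1.0, 200)[:, None]
u_true = np.sin(2 * np.pi * x[:, 0])

# Randomized neural network: hidden weights drawn once from a uniform
# distribution and kept fixed; only the output layer is solved for.
n_hidden = 100
W = rng.uniform(-5.0, 5.0, size=(1, n_hidden))
b = rng.uniform(-5.0, 5.0, size=n_hidden)
Phi = np.tanh(x @ W + b)          # basis (design) matrix, shape (200, 100)

# PCA via truncated SVD reduces the basis and lowers the condition number
U, s, Vt = np.linalg.svd(Phi, full_matrices=False)
k = int(np.sum(s / s[0] > 1e-8))  # keep directions above a relative tolerance
Phi_red = U[:, :k] * s[:k]        # reduced basis spanning the retained modes

for name, A in [("full", Phi), ("PCA-reduced", Phi_red)]:
    c, *_ = np.linalg.lstsq(A, u_true, rcond=None)
    err = np.linalg.norm(A @ c - u_true) / np.linalg.norm(u_true)
    print(f"{name:12s} cond = {np.linalg.cond(A):.2e}  rel. error = {err:.2e}")
```

The reduced basis has condition number at most the full basis's (its singular-value ratio is truncated at the tolerance), while the least-squares fit remains accurate, which mirrors the conditioning benefit the abstract attributes to the per-subdomain PCA step.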