Partial Differential Equations (PDEs) are integral to modeling many scientific and engineering problems. Physics-informed Neural Networks (PINNs) have emerged as promising tools for solving PDEs by embedding the governing equations into the neural network loss function. However, when dealing with PDEs characterized by strong oscillatory dynamics over large computational domains, PINNs based on Multilayer Perceptrons (MLPs) often exhibit poor convergence and reduced accuracy. To address these challenges, this paper introduces Scaled-cPIKAN, a physics-informed architecture rooted in Kolmogorov-Arnold Networks (KANs). Scaled-cPIKAN integrates Chebyshev polynomial representations with a domain scaling approach that transforms the spatial variables of the PDE into the standardized domain \([-1,1]^d\), as intrinsically required by Chebyshev polynomials. By combining the flexibility of Chebyshev-based KANs (cKANs), the physics-driven principles of PINNs, and the spatial domain transformation, Scaled-cPIKAN efficiently represents oscillatory dynamics across extended spatial domains while improving computational performance. We demonstrate the efficacy of Scaled-cPIKAN on four benchmark problems: the diffusion equation, the Helmholtz equation, the Allen-Cahn equation, and both forward and inverse formulations of the reaction-diffusion equation (with and without noisy data). Our results show that Scaled-cPIKAN significantly outperforms existing methods in all test cases. In particular, it achieves accuracy several orders of magnitude higher and converges faster, making it a highly efficient tool for approximating PDE solutions that exhibit oscillatory behavior over large spatial domains.
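To make the core ingredients concrete, the following is a minimal, hypothetical sketch (not the authors' code) of a Chebyshev-based KAN layer combined with affine scaling of spatial inputs to \([-1,1]^d\), written in PyTorch; all function names, the polynomial degree, and the domain bounds are illustrative assumptions.

```python
# Hypothetical sketch (not the authors' implementation): a Chebyshev-based KAN
# layer preceded by affine scaling of inputs from [x_min, x_max]^d to [-1, 1]^d.
import torch
import torch.nn as nn

def scale_to_unit(x, x_min, x_max):
    """Affinely map coordinates from [x_min, x_max] to [-1, 1] per dimension."""
    return 2.0 * (x - x_min) / (x_max - x_min) - 1.0

class ChebyKANLayer(nn.Module):
    """One KAN layer whose learnable edge functions are truncated Chebyshev expansions."""
    def __init__(self, in_dim, out_dim, degree=5):
        super().__init__()
        self.degree = degree
        # One learnable coefficient per (input, output, polynomial order) triple.
        self.coeffs = nn.Parameter(
            torch.randn(in_dim, out_dim, degree + 1) / (in_dim * (degree + 1)) ** 0.5
        )

    def forward(self, x):
        # Keep the arguments of T_k inside [-1, 1]; tanh is a common choice in cKANs.
        x = torch.tanh(x)
        # Chebyshev recurrence: T_0 = 1, T_1 = x, T_k = 2 x T_{k-1} - T_{k-2}.
        T = [torch.ones_like(x), x]
        for _ in range(2, self.degree + 1):
            T.append(2.0 * x * T[-1] - T[-2])
        basis = torch.stack(T, dim=-1)               # (batch, in_dim, degree + 1)
        return torch.einsum("bik,iok->bo", basis, self.coeffs)

# Usage: scale raw spatio-temporal points into [-1, 1]^2 before the first layer.
x_min, x_max = torch.tensor([0.0, 0.0]), torch.tensor([10.0, 1.0])  # assumed domain
net = nn.Sequential(ChebyKANLayer(2, 16), ChebyKANLayer(16, 1))
xt = torch.rand(64, 2) * (x_max - x_min) + x_min     # collocation points in raw domain
u_hat = net(scale_to_unit(xt, x_min, x_max))         # surrogate for the PDE solution
```

In a physics-informed setting, the PDE residual at the collocation points would be computed from `u_hat` via automatic differentiation and added to the loss alongside boundary and initial condition terms; the sketch above only illustrates the scaled Chebyshev-KAN forward pass.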