While Transformers have demonstrated remarkable potential in modeling Partial Differential Equations (PDEs), modeling large-scale unstructured meshes with complex geometries remains a significant challenge. Existing efficient architectures often employ feature dimensionality reduction strategies, which inadvertently induce Geometric Aliasing, resulting in the loss of critical physical boundary information. To address this, we propose the Physics-Geometry Operator Transformer (PGOT), designed to reconstruct physical feature learning through explicit geometry awareness. Specifically, we propose Spectrum-Preserving Geometric Attention (SpecGeo-Attention). Built on a ``physics slicing-geometry injection'' mechanism, this module incorporates multi-scale geometric encodings to explicitly preserve geometric features across scales while maintaining linear computational complexity $O(N)$. Furthermore, based on spatial coordinates, PGOT dynamically routes computation to low-order linear paths in smooth regions and to high-order non-linear paths at shock waves and discontinuities, enabling spatially adaptive, high-precision physical field modeling. PGOT achieves consistent state-of-the-art performance across four standard benchmarks and excels on large-scale industrial tasks, including airfoil and car design.
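To make the two ideas in the abstract concrete, the following is a minimal sketch, not the authors' implementation: it assumes a soft assignment of mesh points to a small number of slice tokens (so attention runs among slices and the per-point cost stays linear in $N$), a Fourier-feature stand-in for the multi-scale geometric encodings injected before slicing and after broadcasting back, and a coordinate-conditioned gate that blends a low-order linear path with a high-order non-linear path. All module names, dimensions, the number of slices, and the specific encoding are illustrative assumptions.

```python
# Hedged sketch of the two mechanisms described in the abstract (not the authors' code).
import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiScaleGeoEncoding(nn.Module):
    """Assumed multi-scale geometric encoding: Fourier features of coordinates at several scales."""
    def __init__(self, coord_dim, dim, scales=(1.0, 4.0, 16.0)):
        super().__init__()
        self.scales = scales
        self.proj = nn.Linear(2 * coord_dim * len(scales), dim)

    def forward(self, coords):                       # coords: (B, N, coord_dim)
        feats = []
        for s in self.scales:
            feats += [torch.sin(s * coords), torch.cos(s * coords)]
        return self.proj(torch.cat(feats, dim=-1))   # (B, N, dim)


class SliceGeoAttention(nn.Module):
    """Sketch of a 'physics slicing-geometry injection' block: N points are softly assigned
    to M << N slice tokens, attention runs among slices (O(N*M) overall), and the geometric
    encoding is injected before slicing and after broadcasting back to points."""
    def __init__(self, dim, coord_dim=2, num_slices=32, num_heads=4):
        super().__init__()
        self.geo = MultiScaleGeoEncoding(coord_dim, dim)
        self.to_slice = nn.Linear(dim, num_slices)    # soft slice-assignment logits
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.out = nn.Linear(dim, dim)

    def forward(self, x, coords):                     # x: (B, N, dim)
        g = self.geo(coords)
        h = x + g                                     # geometry injection (pre-slicing)
        w = F.softmax(self.to_slice(h), dim=-1)       # (B, N, M) assignment weights
        slices = torch.einsum('bnm,bnd->bmd', w, h)   # (B, M, dim) aggregated slice tokens
        slices, _ = self.attn(slices, slices, slices)
        y = torch.einsum('bnm,bmd->bnd', w, slices)   # broadcast back to points
        return x + self.out(y + g)                    # geometry injection (post-broadcast)


class CoordRoutedFFN(nn.Module):
    """Sketch of coordinate-based routing: a gate conditioned on spatial coordinates blends
    a low-order linear path (smooth regions) with a high-order non-linear path (shocks)."""
    def __init__(self, dim, coord_dim=2):
        super().__init__()
        self.linear_path = nn.Linear(dim, dim)
        self.nonlinear_path = nn.Sequential(
            nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
        self.gate = nn.Sequential(nn.Linear(coord_dim, dim), nn.Sigmoid())

    def forward(self, x, coords):
        a = self.gate(coords)                         # (B, N, dim) per-point mixing weights
        return x + a * self.nonlinear_path(x) + (1 - a) * self.linear_path(x)


if __name__ == "__main__":
    B, N, dim = 2, 4096, 64
    coords = torch.rand(B, N, 2)                      # unstructured mesh point coordinates
    feats = torch.randn(B, N, dim)                    # lifted physical features
    out = CoordRoutedFFN(dim)(SliceGeoAttention(dim)(feats, coords), coords)
    print(out.shape)                                  # torch.Size([2, 4096, 64])
```

Because attention is computed only among the $M$ slice tokens, the per-layer cost is $O(NM + M^2)$, which is linear in the number of mesh points for fixed $M$; the routing gate adds only a per-point mixing weight, so spatial adaptivity comes at negligible extra cost.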