We construct several classes of neural networks with ReLU and BiSU (Binary Step Unit) activations, which exactly emulate the lowest order Finite Element (FE) spaces on regular, simplicial partitions of polygonal and polyhedral domains $\Omega \subset \mathbb{R}^d$, $d=2,3$. For continuous, piecewise linear (CPwL) functions, our constructions generalize previous results in that arbitrary, regular simplicial partitions of $\Omega$ are admitted, also in arbitrary dimension $d\geq 2$. Vector-valued elements emulated include the classical Raviart-Thomas elements and the first family of N\'{e}d\'{e}lec edge elements on triangles and tetrahedra. Neural networks emulating these FE spaces are required for the correct approximation of boundary value problems of electromagnetism in nonconvex polyhedra $\Omega \subset \mathbb{R}^3$, and thereby constitute an essential ingredient in the application of, e.g., the methodology of ``physics-informed NNs'' or ``deep Ritz methods'' to electromagnetic field simulation via deep learning techniques. They satisfy exact (De Rham) sequence properties, and also spawn discrete boundary complexes on $\partial\Omega$ which satisfy exact sequence properties for the surface divergence and curl operators $\mathrm{div}_\Gamma$ and $\mathrm{curl}_\Gamma$, respectively, thereby enabling ``neural boundary elements'' for computational electromagnetism. We indicate generalizations of our constructions to higher-order compatible spaces and to other, non-compatible classes of discretizations, in particular the Crouzeix-Raviart elements and Hybridized, Higher Order (HHO) methods.
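To illustrate the kind of exact emulation meant here, the following minimal sketch shows the classical one-dimensional case: a CPwL hat (nodal basis) function on nodes $a<b<c$ is reproduced exactly, not approximately, by a single hidden layer of three ReLU units. This is a simplified 1D analogue for illustration only; the constructions in the paper concern regular simplicial partitions in dimension $d\geq 2$ and also use BiSU activations.

```python
import numpy as np

def relu(x):
    """ReLU activation."""
    return np.maximum(x, 0.0)

def hat_relu(x, a, b, c):
    """Exact ReLU emulation of the 1D CPwL hat function with nodes a < b < c.

    The hat function rises linearly from 0 at a to 1 at b, then falls
    linearly back to 0 at c, and vanishes outside [a, c].  It equals a
    fixed linear combination of three ReLU units (one hidden layer):
        (1/(b-a)) relu(x-a)
      - ((c-a)/((b-a)(c-b))) relu(x-b)
      + (1/(c-b)) relu(x-c)
    The slopes cancel exactly outside [a, c], so the emulation is exact.
    """
    return (relu(x - a) / (b - a)
            - (c - a) / ((b - a) * (c - b)) * relu(x - b)
            + relu(x - c) / (c - b))
```

Summing such hat functions weighted by nodal values reproduces any CPwL function on the partition exactly; the paper's results extend this principle to simplicial meshes in higher dimensions and to the vector-valued Raviart-Thomas and N\'{e}d\'{e}lec families.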