We constructively show, via rigorous mathematical arguments, that GNN architectures outperform NN architectures in approximating bandlimited functions on compact $d$-dimensional Euclidean grids. We show that the former need only $\mathcal{M}$ sampled function values to achieve a uniform approximation error of $O_{d}(2^{-\mathcal{M}^{1/d}})$, and that this error rate is optimal in the sense that NNs may achieve a worse one.
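For concreteness, the claimed guarantee can be read as the following uniform bound (a sketch in our own shorthand: $f$ denotes the bandlimited target, $\widehat{f}_{\mathcal{M}}$ the GNN approximant built from the samples, and $X$ the compact grid; these symbols are not fixed in the abstract itself):
\[
\sup_{x \in X} \bigl| f(x) - \widehat{f}_{\mathcal{M}}(x) \bigr| \;=\; O_{d}\!\bigl( 2^{-\mathcal{M}^{1/d}} \bigr), \qquad \mathcal{M} = \text{number of sampled function values}.
\]
In particular, the error decays exponentially in $\mathcal{M}^{1/d}$: in dimension $d=1$ one additional sample halves the bound, whereas in dimension $d$ roughly $d\,\mathcal{M}^{(d-1)/d}$ additional samples are needed for the same halving.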