We study the approximation of multivariate functions with tensor networks (TNs). The main conclusion of this work is an answer to the following two questions: "What are the approximation capabilities of TNs?" and "What is an appropriate model class of functions that can be approximated with TNs?" To answer the former, we show that TNs can (near-)optimally replicate $h$-uniform and $h$-adaptive approximation for any smoothness order of the target function. Tensor networks thus exhibit universal expressivity with respect to isotropic, anisotropic and mixed smoothness spaces that is comparable to that of more general neural network families, such as deep rectified linear unit (ReLU) networks. Put differently, TNs have the capacity to (near-)optimally approximate many function classes -- without being adapted to the particular class in question. To answer the latter, we consider approximation classes of TNs as a candidate model class and show that these are (quasi-)Banach spaces, that many types of classical smoothness spaces are continuously embedded into said approximation classes, and that TN approximation classes are themselves not embedded in any classical smoothness space.
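The abstract's core idea can be illustrated concretely. A minimal, hypothetical sketch (not the paper's construction): sample a univariate function on a dyadic grid of $2^d$ points, reshape the samples into a $2 \times 2 \times \cdots \times 2$ tensor (tensorization, with the binary digits of the grid index as tensor modes), and compress it with a truncated tensor-train (TT) SVD. Small TT ranks then correspond to an efficient tensor-network representation of the function; the function names `tt_svd` and `tt_reconstruct` are ours, not from the paper.

```python
import numpy as np

def tt_svd(tensor, eps=1e-12):
    """Compress a tensor into TT cores via sequential truncated SVDs.

    Singular values below eps * (largest singular value) are dropped at
    each unfolding, which controls the TT ranks of the approximation.
    """
    dims = tensor.shape
    d = len(dims)
    cores, rank = [], 1
    mat = tensor.reshape(rank * dims[0], -1)
    for k in range(d - 1):
        u, s, vt = np.linalg.svd(mat, full_matrices=False)
        r_new = max(1, int(np.sum(s > eps * s[0])))
        cores.append(u[:, :r_new].reshape(rank, dims[k], r_new))
        mat = (np.diag(s[:r_new]) @ vt[:r_new]).reshape(r_new * dims[k + 1], -1)
        rank = r_new
    cores.append(mat.reshape(rank, dims[-1], 1))
    return cores

def tt_reconstruct(cores):
    """Contract TT cores back into a full tensor."""
    out = cores[0]
    for core in cores[1:]:
        out = np.tensordot(out, core, axes=([-1], [0]))
    return out.reshape(out.shape[1:-1])

# Tensorize samples of a smooth function on a dyadic grid of 2**d points.
d = 10
x = np.arange(2**d) / 2**d
samples = np.exp(-x)                 # exp(-x) is separable across binary digits
tensor = samples.reshape((2,) * d)   # one mode per binary digit of the index
cores = tt_svd(tensor)
ranks = [c.shape[2] for c in cores[:-1]]
err = np.max(np.abs(tt_reconstruct(cores).reshape(-1) - samples))
```

For this particular target, $e^{-x}$ factorizes over the binary digits of $x$, so the TT ranks stay at 1 and the reconstruction error is near machine precision; less structured functions would require higher ranks, and the paper's approximation classes quantify this trade-off.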