We study the approximation by tensor networks (TNs) of functions from classical smoothness classes. The considered approximation tool combines a tensorization of functions in $L^p([0,1))$, which allows one to identify a univariate function with a multivariate function (or tensor), and the use of tree tensor networks (the tensor train format) for exploiting low-rank structures of multivariate functions. The resulting tool can be interpreted as a feed-forward neural network whose first layers implement the tensorization, interpreted as a particular featurization step, followed by a sum-product network with sparse architecture. In part I of this work, we introduced several approximation classes associated with different measures of complexity of tensor networks and studied their properties. In this work (part II), we show how classical approximation tools, such as polynomials or splines (with fixed or free knots), can be encoded as tensor networks with controlled complexity. We use this to derive direct (Jackson) inequalities for the approximation spaces of tensor networks, and then to show that Besov spaces are continuously embedded into these approximation spaces. In other words, we show that arbitrary Besov functions can be approximated at optimal or near-optimal rate. We also show that an arbitrary function in the approximation class need not possess any Besov smoothness, unless one limits the depth of the tensor network.
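To illustrate the tensorization step, here is a sketch of the standard base-$b$ construction (the base $b$, depth $d$, and the names $i_k$, $\bar{x}$, $F$ are generic notation for illustration, not fixed by the abstract; $b=2$ is a common choice). A point $x \in [0,1)$ is identified with its first $d$ digits in base $b$ together with a remainder,
$$
x \;=\; \sum_{k=1}^{d} i_k\, b^{-k} \;+\; b^{-d}\,\bar{x}, \qquad i_k \in \{0,\dots,b-1\},\quad \bar{x} \in [0,1),
$$
so that a univariate function $f \in L^p([0,1))$ is identified with the multivariate function (or tensor)
$$
F(i_1,\dots,i_d,\bar{x}) \;=\; f(x),
$$
whose low-rank structure can then be exploited by the tensor train format.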