We study the approximation properties of convolutional architectures applied to time series modelling, which can be formulated mathematically as a functional approximation problem. In the recurrent setting, recent results reveal an intricate connection between approximation efficiency and memory structures in the data generation process. In this paper, we derive parallel results for convolutional architectures, with WaveNet being a prime example. Our results reveal that in this new setting, approximation efficiency is characterised not only by memory, but also by additional fine structures in the target relationship. This leads to a novel definition of spectrum-based regularity that measures the complexity of temporal relationships under the convolutional approximation scheme. These analyses provide a foundation for understanding the differences between architectural choices for time series modelling and can give theoretically grounded guidance for practical applications.
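To make the convolutional setting concrete, the following is a minimal sketch (not taken from the paper) of the dilated causal convolution underlying WaveNet-style architectures; the function names and the plain-numpy implementation are illustrative assumptions. It shows the two properties relevant to temporal modelling: causality (the output at time t depends only on present and past inputs) and the exponential growth of the receptive field when dilations are doubled layer by layer.

```python
import numpy as np

def causal_dilated_conv(x, w, dilation):
    """Illustrative 1-D causal convolution with a given dilation.

    The output at time t combines x[t], x[t - d], x[t - 2d], ...,
    so the model never looks into the future (the causality
    constraint that WaveNet-style architectures impose).
    """
    T, k = len(x), len(w)
    y = np.zeros(T)
    for t in range(T):
        for j in range(k):
            idx = t - j * dilation
            if idx >= 0:  # indices before the start of the series contribute nothing
                y[t] += w[j] * x[idx]
    return y

def receptive_field(kernel_size, dilations):
    """How far back a stack of dilated causal layers can see.

    With dilations 1, 2, 4, ... the receptive field grows
    exponentially in depth, which is how such architectures
    capture long memory with few layers.
    """
    return 1 + sum((kernel_size - 1) * d for d in dilations)

# Example: kernel size 2 with dilations 1, 2, 4, 8 sees 16 past steps.
x = np.array([1.0, 2.0, 3.0, 4.0])
w = np.array([1.0, 1.0])
print(causal_dilated_conv(x, w, dilation=1))   # each output sums x[t] and x[t-1]
print(receptive_field(2, [1, 2, 4, 8]))        # 16
```

Stacking such layers with doubling dilations is precisely what lets a fixed-width convolutional model trade depth for memory span, the trade-off the approximation analysis quantifies.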