The receptive field (RF), which determines the region of a time series that is ``seen'' and used, is critical to the performance of time series classification (TSC). However, the variation of signal scales across and within time series makes it challenging to choose proper RF sizes for TSC. In this paper, we propose a Dynamic Sparse Network (DSN) with sparse connections for TSC, which can learn to cover various RF sizes without cumbersome hyper-parameter tuning. The kernels in each sparse layer are sparse and can be explored within constrained regions by dynamic sparse training, which reduces the resource cost. Experimental results show that the proposed DSN model achieves state-of-the-art performance on both univariate and multivariate TSC datasets at less than 50\% of the computational cost of recent baseline methods, opening the path towards more accurate, resource-aware methods for time series analysis. Our code is publicly available at: https://github.com/QiaoXiao7282/DSN.
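To make the idea of exploring sparse kernels concrete, the sketch below shows a generic prune-and-regrow step in the style of dynamic sparse training: the smallest-magnitude active weights of a 1-D kernel are dropped and the same number of positions are reactivated at random, so the sparsity level stays fixed while the active taps (and hence the effective RF) move around. This is a minimal illustration of the general technique, not the authors' exact DSN update; the function name, the magnitude-based prune criterion, and the random regrowth rule are all illustrative assumptions.

```python
import numpy as np

def prune_and_regrow(weights, mask, prune_frac=0.3, rng=None):
    """One generic dynamic-sparse-training step on a 1-D kernel:
    drop the smallest-magnitude active weights, then regrow the
    same number of connections at random inactive positions.
    (Illustrative sketch; not the paper's exact update rule.)"""
    rng = np.random.default_rng(rng)
    active = np.flatnonzero(mask)
    n_prune = int(len(active) * prune_frac)
    if n_prune == 0:
        return mask.copy()
    # Prune: deactivate the n_prune active taps with smallest |w|.
    drop = active[np.argsort(np.abs(weights[active]))[:n_prune]]
    new_mask = mask.copy()
    new_mask[drop] = 0
    # Regrow: reactivate the same number of taps at random
    # inactive positions, keeping the sparsity level constant.
    inactive = np.flatnonzero(new_mask == 0)
    grow = rng.choice(inactive, size=n_prune, replace=False)
    new_mask[grow] = 1
    return new_mask

# Toy example: a length-16 kernel with 4 active taps.
rng = np.random.default_rng(0)
w = rng.normal(size=16)
mask = np.zeros(16, dtype=int)
mask[[0, 3, 7, 12]] = 1
new_mask = prune_and_regrow(w, mask, prune_frac=0.5, rng=1)
assert new_mask.sum() == mask.sum()  # same number of active taps
```

Because only the mask changes between such steps, the dense weight tensor never grows, which is where the computational savings over fully dense kernels come from.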