Time Series Classification (TSC) is an important and challenging task for many visual computing applications. Despite the extensive range of methods developed for TSC, relatively few utilize Deep Neural Networks (DNNs). In this paper, we propose two novel attention blocks (Global Temporal Attention and Temporal Pseudo-Gaussian augmented Self-Attention) that can enhance deep learning-based TSC approaches, even when such approaches are designed and optimized for a specific dataset or task. We validate this claim by evaluating multiple state-of-the-art deep learning-based TSC models on the University of East Anglia (UEA) benchmark, a standardized collection of 30 Multivariate Time Series Classification (MTSC) datasets. We show that adding the proposed attention blocks improves the base models' average accuracy by up to 3.6%. Additionally, the proposed TPS block uses a new injection module to incorporate relative positional information into transformers. As a standalone unit with lower computational complexity, TPS outperforms most state-of-the-art DNN-based TSC methods. The source code for our experimental setups and proposed attention blocks is publicly available.