An Asynchronous Time Series (AsTS) is a multivariate time series whose channels are observed asynchronously and independently of one another, making the series extremely sparse when the channels are aligned in time. This effect commonly arises in applications with complex observation processes, such as health care, climate science, and astronomy, to name a few. Because of their asynchronous nature, such series pose a significant challenge to deep learning architectures, which presume that the time series presented to them are regularly sampled, fully observed, and aligned with respect to time. This paper proposes a novel framework, which we call Deep Convolutional Set Functions (DCSF), that is highly scalable and memory efficient for the asynchronous time series classification task. Building on recent advances in deep set learning architectures, we introduce a model that is invariant to the order in which the channels of a time series are presented to it. To encode the set elements, we explore convolutional neural networks, which are well researched for the closely related problem of classifying regularly sampled and fully observed time series. We evaluate DCSF on AsTS classification and on online (per time point) AsTS classification. Our extensive experiments on multiple real-world and synthetic datasets verify that the proposed model performs substantially better than a range of state-of-the-art models in terms of both accuracy and run time.
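The core idea stated above, a channel-order-invariant set encoding with a shared convolutional encoder, can be illustrated with a minimal NumPy sketch. This is not the paper's actual DCSF architecture: the toy one-dimensional convolution, the ReLU-plus-mean pooling, and all sizes below are illustrative assumptions, chosen only to show why sum-pooling per-channel embeddings makes the result independent of channel order.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode_channel(values, weights):
    # Toy stand-in for the shared CNN encoder: 1-D convolution,
    # ReLU nonlinearity, then mean pooling over the time axis.
    conv = np.convolve(values, weights, mode="same")
    return np.maximum(conv, 0.0).mean()

def set_encode(channels, weights):
    # Sum-pooling the per-channel embeddings is commutative,
    # so the set embedding is invariant to channel order.
    return sum(encode_channel(c, weights) for c in channels)

weights = rng.normal(size=3)
# Three channels of different lengths, mimicking asynchronous observation.
channels = [rng.normal(size=n) for n in (5, 9, 4)]

z1 = set_encode(channels, weights)
z2 = set_encode(channels[::-1], weights)  # same channels, reversed order
print(np.isclose(z1, z2))  # True: the embedding ignores channel order
```

Note that because each channel is encoded independently, channels of different lengths pose no problem, which is exactly what the asynchronous setting requires.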