This research identifies a gap in weakly labelled multivariate time-series classification (TSC), where state-of-the-art TSC models do not perform well. Weakly labelled time-series are time-series containing noise and significant redundancies. In response to this gap, this paper proposes an approach that exploits the contextual relevance of preceding subsequences to improve classification accuracy. To achieve this, state-of-the-art attention algorithms are evaluated in combination with the top CNN models for TSC (FCN and ResNet) in a CNN-LSTM architecture. Attention is a popular strategy for context extraction, with exceptional performance in modern sequence-to-sequence tasks. This paper shows how attention algorithms can be used to improve weakly labelled TSC by evaluating models on a multivariate EEG time-series dataset collected with a commercial Emotiv headset from participants performing various activities while driving. These time-series are segmented into subsequences and labelled to allow supervised TSC.
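To make the described architecture concrete, the sketch below shows one plausible way to combine an FCN-style convolutional feature extractor, an LSTM over consecutive subsequences, and an additive attention layer that weighs earlier subsequences as context. All layer sizes, the window/segmentation scheme, and names such as `AttentiveCNNLSTM` are illustrative assumptions rather than the authors' exact configuration.

```python
# Minimal PyTorch sketch of a CNN-LSTM with attention over subsequences.
# Hyperparameters and the 14-channel EEG / 4-class setup are assumptions.
import torch
import torch.nn as nn

class FCNBlock(nn.Module):
    """FCN-style feature extractor applied to each subsequence independently."""
    def __init__(self, in_channels, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(in_channels, hidden, kernel_size=8, padding="same"),
            nn.BatchNorm1d(hidden), nn.ReLU(),
            nn.Conv1d(hidden, hidden, kernel_size=5, padding="same"),
            nn.BatchNorm1d(hidden), nn.ReLU(),
            nn.Conv1d(hidden, hidden, kernel_size=3, padding="same"),
            nn.BatchNorm1d(hidden), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),          # global average pooling over time
        )

    def forward(self, x):                     # x: (batch, channels, time)
        return self.net(x).squeeze(-1)        # (batch, hidden)

class AttentiveCNNLSTM(nn.Module):
    """Classifies a window of consecutive subsequences, letting attention
    weigh preceding subsequences as context for the prediction."""
    def __init__(self, in_channels, n_classes, hidden=64, lstm_hidden=128):
        super().__init__()
        self.cnn = FCNBlock(in_channels, hidden)
        self.lstm = nn.LSTM(hidden, lstm_hidden, batch_first=True)
        self.attn = nn.Linear(lstm_hidden, 1)   # additive attention scores
        self.head = nn.Linear(lstm_hidden, n_classes)

    def forward(self, x):                        # x: (batch, n_subseq, channels, time)
        b, s, c, t = x.shape
        feats = self.cnn(x.reshape(b * s, c, t)).reshape(b, s, -1)
        out, _ = self.lstm(feats)                # (batch, n_subseq, lstm_hidden)
        weights = torch.softmax(self.attn(out), dim=1)   # attention over subsequences
        context = (weights * out).sum(dim=1)     # weighted context vector
        return self.head(context)                # class logits per window

# Example: 14-channel EEG, windows of 5 subsequences of 128 samples, 4 activities.
model = AttentiveCNNLSTM(in_channels=14, n_classes=4)
logits = model(torch.randn(8, 5, 14, 128))       # -> shape (8, 4)
```

Under these assumptions, each labelled subsequence is classified together with its preceding subsequences, so the attention weights indicate how much earlier context contributes to the current prediction.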