Deep neural networks (DNNs) have achieved state-of-the-art results on time series classification (TSC) tasks. In this work, we focus on leveraging DNNs in the often-encountered practical scenario where labeled training data is scarce and DNNs are prone to overfitting. We leverage recent advances in gradient-based meta-learning and propose an approach to train a residual neural network with convolutional layers as a meta-learning agent for few-shot TSC. The network is trained on a diverse set of few-shot tasks sampled from various domains (e.g., healthcare, activity recognition) such that it can solve a target task from another domain using only a small number of training samples from that task. Most existing meta-learning approaches are limited in practice because they assume a fixed number of target classes across tasks. To overcome this limitation and train a common agent across domains where each domain has a different number of target classes, we utilize a triplet-loss-based learning procedure that places no constraints on the number of classes in the few-shot TSC tasks. To the best of our knowledge, we are the first to use meta-learning-based pre-training for TSC. Our approach sets a new benchmark for few-shot TSC, outperforming several strong baselines on few-shot tasks sampled from 41 datasets in the UCR TSC Archive. We observe that pre-training under the meta-learning paradigm allows the network to quickly adapt to new, unseen tasks with a small number of labeled instances.
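The key device that frees the agent from a fixed output dimension is the triplet loss: instead of predicting class probabilities, the network learns an embedding in which same-class series are closer than different-class series, so any number of classes can be handled by nearest-neighbor comparison in embedding space. A minimal sketch of a standard margin-based triplet loss is shown below; the encoder architecture and triplet sampling strategy are not specified here, and the `triplet_loss` function and toy embeddings are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=1.0):
    """Hinge-style triplet loss: pulls same-class embeddings together and
    pushes different-class embeddings at least `margin` further away.
    Note: this is a generic formulation, not the paper's exact objective."""
    d_pos = np.sum((anchor - positive) ** 2, axis=-1)  # squared distance to a same-class sample
    d_neg = np.sum((anchor - negative) ** 2, axis=-1)  # squared distance to a different-class sample
    return np.maximum(0.0, d_pos - d_neg + margin)

# Toy 2-D embeddings standing in for the output of a (hypothetical) encoder.
a = np.array([[0.0, 0.0]])   # anchor
p = np.array([[0.1, 0.0]])   # same class, close to anchor
n = np.array([[2.0, 0.0]])   # different class, far from anchor
loss = triplet_loss(a, p, n)  # well-separated triplet -> zero loss
```

Because the loss depends only on pairwise distances between embeddings, nothing in it references a class count, which is what allows one agent to be meta-trained across domains with different numbers of target classes.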