A change point is a moment of abrupt alteration in the data distribution. Current methods for change point detection are typically based on recurrent neural networks suited to sequential data. However, recent work shows that transformers built on attention mechanisms outperform standard recurrent models on many tasks, with the largest gains on longer sequences. In this paper, we investigate different attention mechanisms for the change point detection task and propose a specific form of attention tailored to it. We show that this special form of attention outperforms state-of-the-art results.
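As background for the attention mechanisms referenced above, a minimal sketch of standard scaled dot-product self-attention over a toy sequence is shown below. This is the generic transformer building block, not the specific task-related attention form proposed in the paper; the array shapes and random toy data are illustrative assumptions.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Standard scaled dot-product attention.

    Q, K, V: arrays of shape (seq_len, d); returns an array of shape (seq_len, d).
    """
    d = Q.shape[-1]
    # Pairwise similarity scores between time steps, shape (seq_len, seq_len).
    scores = Q @ K.T / np.sqrt(d)
    # Row-wise softmax turns scores into attention weights summing to 1.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output step is a weighted average of the value vectors.
    return weights @ V

# Toy sequence: 6 time steps with 4-dimensional features (illustrative data).
rng = np.random.default_rng(0)
x = rng.standard_normal((6, 4))
out = scaled_dot_product_attention(x, x, x)  # self-attention: Q = K = V = x
print(out.shape)
```

Because every output step attends to all time steps at once, attention avoids the sequential bottleneck of recurrent models, which is why the benefit grows with sequence length.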