Human sleep is cyclical with a period of approximately 90 minutes, implying long temporal dependencies in sleep data. Yet, exploiting this long-term dependency when developing sleep staging models has remained unexplored. In this work, we show that while encoding the logic of a whole sleep cycle is crucial for improving sleep staging performance, the sequential modelling approach in existing state-of-the-art deep learning models is inefficient for that purpose. We thus introduce a method for efficient long-sequence modelling and propose a new deep learning model, L-SeqSleepNet, which takes whole-cycle sleep information into account for sleep staging. Evaluating L-SeqSleepNet on four distinct databases of various sizes, we demonstrate state-of-the-art performance across three different EEG setups, including scalp EEG in conventional polysomnography (PSG), in-ear EEG, and around-the-ear EEG (cEEGrid), even with a single EEG channel as input. Our analyses also show that L-SeqSleepNet is able to alleviate the predominance of N2 sleep (the majority class) and thereby reduce errors in other sleep stages. Moreover, the network becomes considerably more robust: for all subjects on whom the baseline method performed exceptionally poorly, performance is improved significantly. Finally, the computation time grows only at a sub-linear rate as the sequence length increases.