Introduction: Sleep staging is an essential component in the diagnosis of sleep disorders and management of sleep health. It is traditionally measured in a clinical setting and requires a labor-intensive labeling process. We hypothesize that it is possible to perform robust 4-class sleep staging using the raw photoplethysmography (PPG) time series and modern advances in deep learning (DL). Methods: We used two publicly available sleep databases that included raw PPG recordings, totalling 2,374 patients and 23,055 hours. We developed SleepPPG-Net, a DL model for 4-class sleep staging from the raw PPG time series. SleepPPG-Net was trained end-to-end and consists of a residual convolutional network for automatic feature extraction and a temporal convolutional network to capture long-range contextual information. We benchmarked the performance of SleepPPG-Net against models based on the best-reported state-of-the-art (SOTA) algorithms. Results: When benchmarked on a held-out test set, SleepPPG-Net obtained a median Cohen's Kappa ($\kappa$) score of 0.75 against 0.69 for the best SOTA approach. SleepPPG-Net showed good generalization performance to an external database, obtaining a $\kappa$ score of 0.74 after transfer learning. Perspective: Overall, SleepPPG-Net provides new SOTA performance. In addition, performance is high enough to open the path to the development of wearables that meet the requirements for usage in clinical applications such as the diagnosis and monitoring of obstructive sleep apnea.
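The long-range contextual modeling attributed to the temporal convolutional network above rests on dilated causal convolutions, whose receptive field grows with the dilation schedule. As an illustrative stdlib-only sketch (hypothetical, not the authors' SleepPPG-Net implementation; `dilated_conv1d` and `receptive_field` are names introduced here for clarity):

```python
def dilated_conv1d(x, kernel, dilation):
    """Causal 1-D dilated convolution over a plain Python list.
    Each kernel tap i reaches back (k-1-i)*dilation samples;
    pre-sequence samples are treated as zero (causal padding)."""
    k = len(kernel)
    out = []
    for t in range(len(x)):
        acc = 0.0
        for i, w in enumerate(kernel):
            j = t - (k - 1 - i) * dilation  # index of the sample tap i reads
            if j >= 0:
                acc += w * x[j]
        out.append(acc)
    return out


def receptive_field(kernel_size, dilations):
    """Receptive field (in samples) of a stack of dilated conv layers."""
    return 1 + sum((kernel_size - 1) * d for d in dilations)


# An impulse input exposes the dilation spacing: with dilation=2 the
# kernel taps land 2 samples apart.
print(dilated_conv1d([1, 0, 0, 0, 0], [1.0, 1.0, 1.0], 2))  # [1.0, 0.0, 1.0, 0.0, 1.0]

# Doubling dilations (1, 2, 4, 8) with kernel size 3 already spans 31 samples.
print(receptive_field(3, [1, 2, 4, 8]))  # 31
```

This is why a TCN can summarize many minutes of PPG-derived features per sleep epoch with only a handful of layers: the receptive field grows geometrically with depth when dilations double, while parameter count grows only linearly.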