Question: Can an encoder-decoder architecture pretrained on a large dataset of longitudinal electronic health records improve patient outcome predictions? Findings: In this prognostic study of 6.8 million patients, our denoising sequence-to-sequence model predicting multiple outcomes outperformed state-of-the-art models such as pretrained BERT on a broad range of patient outcomes, including intentional self-harm and pancreatic cancer. Meaning: Deep bidirectional and autoregressive representations improve patient outcome prediction.