The application of Transformer neural networks to Electronic Health Records (EHR) is challenging due to the distinct, multidimensional sequential structure of EHR data, which often leads to underperformance compared to simpler linear models. As a result, the advantages of Transformers, such as efficient transfer learning and improved scalability, are not fully exploited in EHR applications. To overcome these challenges, we introduce SANSformer, a novel attention-free sequential model designed with inductive biases tailored to the unique characteristics of EHR data. Our main application area is predicting future healthcare utilization, a crucial task for allocating healthcare resources effectively. This task becomes particularly difficult for divergent patient subgroups, such as patients with rare diseases, which are characterized by unique health trajectories, are often small in size, and therefore require specialized modeling approaches. To address this, we adopt a self-supervised pretraining strategy, which we term Generative Summary Pretraining (GSP). GSP predicts summary statistics of a future window in the patient's history from their past health records, showing promise in handling the noisy and complex nature of EHR data. We pretrain our models on a comprehensive health registry encompassing close to one million patients before fine-tuning them for specific subgroup prediction tasks. In our evaluations, SANSformer consistently outperforms strong EHR baselines. Importantly, our GSP pretraining method greatly enhances model performance, especially for smaller patient subgroups. Our findings underscore the substantial potential of bespoke attention-free models and self-supervised pretraining for enhancing healthcare utilization predictions across a broad range of patient groups.
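As a rough illustration of the GSP objective described above, the sketch below constructs pretraining targets by splitting one patient's visit history at a cutoff and summarizing the subsequent window. The field names, window lengths, and choice of summary statistics here are illustrative assumptions, not the paper's exact specification.

```python
# Minimal sketch of building Generative Summary Pretraining (GSP) targets.
# Hypothetical fields and windows; the paper's actual summary statistics
# and windowing scheme may differ.
import numpy as np

def gsp_example(visit_times, visit_costs, cutoff, future_window):
    """Split one patient's history at `cutoff`: visits up to the cutoff form
    the model input, and summary statistics of the following window form the
    regression target."""
    visit_times = np.asarray(visit_times)
    visit_costs = np.asarray(visit_costs)

    past_mask = visit_times <= cutoff
    future_mask = (visit_times > cutoff) & (visit_times <= cutoff + future_window)

    past_visits = visit_times[past_mask]          # fed to the sequence model
    target = np.array([
        future_mask.sum(),                        # number of visits in the future window
        visit_costs[future_mask].sum(),           # total utilization/cost in that window
    ])
    return past_visits, target

# Toy patient: visit days since first contact and per-visit cost units.
past, target = gsp_example(
    visit_times=[3, 40, 95, 210, 400, 460],
    visit_costs=[1.0, 2.5, 0.5, 3.0, 1.5, 2.0],
    cutoff=365,
    future_window=180,
)
print(past, target)   # past visits -> model input; summary vector -> GSP target
```

Predicting aggregate statistics of the future window, rather than reconstructing individual future events, is one plausible reason such an objective can tolerate the event-level noise in EHR sequences.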