Brain-computer interfaces (BCIs) are one of the few alternatives enabling locked-in syndrome (LIS) patients to communicate with the external world, and they are the only solution for complete locked-in syndrome (CLIS) patients, who have lost the ability to control eye movements. However, the successful use of endogenous electroencephalogram (EEG)-based BCI applications is often not trivial, due to EEG variations between and within sessions and the long user training required. In this work, we suggest an approach to address these two main limitations of EEG-based BCIs by embedding a progressive and expandable neurofeedback training program, able to continuously tailor the classifier to the specific user, into a multimodal BCI paradigm. Specifically, we propose the integration of EEG with a non-brain signal: the pupillary accommodative response (PAR). The PAR is a change in pupil size associated with gaze shifts from far to near targets; it is not governed by the somatic nervous system and is thus potentially preserved after the evolution from LIS to CLIS, which often occurs in neurodegenerative diseases such as amyotrophic lateral sclerosis. Multimodal BCIs have been broadly investigated in the literature, owing to their ability to yield better overall control performance, but this would be the first attempt to combine EEG and PAR. In the context of the BciPar4Sla project, we are exploiting these two signals with the aim of developing a more reliable BCI that adapts to the user, evolving together with the user's ability to elicit the brain phenomena needed for optimal control, and providing support even in the transition from LIS to CLIS.