Deep neural networks (DNNs) have become increasingly utilized in brain-computer interface (BCI) technologies, with the goal of classifying human physiological signals into a computer-readable format. While our present understanding of DNN usage for BCI is promising, we have little experience in deciphering neural events in dynamic, freely mobile situations. Using an improved version of EEGNet, our goal was to classify cognitive events from electroencephalography (EEG) signals while subjects walked on a treadmill, sometimes while carrying a rucksack equivalent to 40% of their body weight. Walking subjects simultaneously performed a visual oddball target detection task, eliciting the P300 event-related potential (ERP), which served as the DNN classification target. We found that the base EEGNet reached classification levels well above chance, with performance similar to previously reported P300 results. Performance was robust to movement noise: classification accuracy during walking and loaded walking was similar to that in a standard seated condition with minimal movement. With additional architecture search and tuning of the EEGNet model (termed Cog-Neuro EEGNet herein; CN-EEGNet), we reached classification accuracy greater than 95%, similar to previously reported state-of-the-art levels in seated P300 tasks. To our knowledge, these results are the first documented implementation of a DNN for the classification of cognitive neural state during dual-task walking. The classification of one's ongoing cognitive state during a demanding physical task establishes the utility of BCI in complex environments.
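For reference, below is a minimal Keras sketch of the baseline EEGNet architecture (Lawhern et al., 2018) applied to binary target/non-target P300 classification. The 64-channel, 128-sample input shape and the filter counts (F1, D, F2) follow the published EEGNet defaults and are illustrative assumptions; they do not represent the tuned CN-EEGNet configuration described in this work.

from tensorflow.keras import layers, models, constraints

def eegnet_style(n_channels=64, n_samples=128, n_classes=2,
                 F1=8, D=2, F2=16, dropout=0.5):
    """EEGNet-style classifier: temporal conv -> depthwise spatial
    conv -> separable conv -> dense softmax (Lawhern et al., 2018)."""
    inp = layers.Input(shape=(n_channels, n_samples, 1))

    # Block 1: temporal filters, then per-filter spatial filters
    x = layers.Conv2D(F1, (1, 64), padding='same', use_bias=False)(inp)
    x = layers.BatchNormalization()(x)
    x = layers.DepthwiseConv2D((n_channels, 1), depth_multiplier=D,
                               use_bias=False,
                               depthwise_constraint=constraints.max_norm(1.))(x)
    x = layers.BatchNormalization()(x)
    x = layers.Activation('elu')(x)
    x = layers.AveragePooling2D((1, 4))(x)
    x = layers.Dropout(dropout)(x)

    # Block 2: separable conv summarizes each feature map over time
    x = layers.SeparableConv2D(F2, (1, 16), padding='same', use_bias=False)(x)
    x = layers.BatchNormalization()(x)
    x = layers.Activation('elu')(x)
    x = layers.AveragePooling2D((1, 8))(x)
    x = layers.Dropout(dropout)(x)

    x = layers.Flatten()(x)
    out = layers.Dense(n_classes, activation='softmax',
                       kernel_constraint=constraints.max_norm(0.25))(x)
    return models.Model(inp, out)

model = eegnet_style()
model.compile(optimizer='adam', loss='categorical_crossentropy',
              metrics=['accuracy'])

The depthwise spatial convolution is the design choice that keeps the model compact: it learns per-temporal-filter spatial weightings across electrodes rather than a full dense mixing, which suits the limited trial counts typical of ERP experiments.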