This paper presents an accurate and robust embedded motor-imagery brain-computer interface (MI-BCI). The proposed novel model, based on EEGNet, meets the memory-footprint and computational constraints of low-power microcontroller units (MCUs), such as the ARM Cortex-M family. Furthermore, the paper presents a set of methods, including temporal downsampling, channel selection, and narrowing of the classification window, to further scale down the model and relax memory requirements with negligible accuracy degradation. Experimental results on the Physionet EEG Motor Movement/Imagery Dataset show that standard EEGNet achieves 82.43%, 75.07%, and 65.07% classification accuracy on 2-, 3-, and 4-class MI tasks in global validation, outperforming the state-of-the-art (SoA) convolutional neural network (CNN) by 2.05%, 5.25%, and 5.48%. Our scaling methods further shrink the standard EEGNet with a negligible accuracy loss of 0.31% at a 7.6× memory-footprint reduction, and with a small accuracy loss of 2.51% at a 15× reduction. The scaled models are deployed on a commercial Cortex-M4F MCU, taking 101 ms and consuming 4.28 mJ per inference for the smallest model, and on a Cortex-M7, taking 44 ms and consuming 18.1 mJ per inference for the medium-sized model, enabling a fully autonomous, wearable, and accurate low-power BCI.
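To make the three scaling knobs concrete, the following is a minimal sketch (not the authors' code) of how temporal downsampling, channel selection, and a narrower classification window can be applied to an EEG trial before an EEGNet-style model; the function name, downsampling factor, channel subset, and window length are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def scale_down_eeg(trial, ds_factor=2, channels=None, window_s=2.0, fs=160):
    """Reduce an EEG trial (channels x samples) before feeding an
    EEGNet-style classifier, shrinking both memory and compute.

    trial:      (n_channels, n_samples) raw EEG at sampling rate `fs` (Hz)
    ds_factor:  temporal downsampling factor (keep every ds_factor-th sample)
    channels:   indices of the electrodes to keep (None = keep all)
    window_s:   length of the classification window in seconds
    """
    x = trial if channels is None else trial[channels, :]   # channel selection
    x = x[:, ::ds_factor]                                   # temporal downsampling
    n_keep = int(window_s * fs / ds_factor)                 # narrower window
    return x[:, :n_keep]

# Example: a Physionet-like trial (64 channels at 160 Hz, 3 s of data)
# reduced to a hypothetical 8-channel, 80 Hz, 2 s input.
trial = np.random.randn(64, 3 * 160)
small = scale_down_eeg(trial, ds_factor=2, channels=list(range(8)), window_s=2.0)
print(small.shape)  # (8, 160)
```

Each knob cuts the input tensor along one axis, which is what drives the reported memory-footprint reductions: the model's first-layer activations and weights shrink proportionally with fewer channels, a lower sampling rate, and a shorter window.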