Successful motor-imagery brain-computer interface (MI-BCI) algorithms either extract a large number of handcrafted features and train a classifier, or combine feature extraction and classification within deep convolutional neural networks (CNNs). Both approaches typically result in a set of real-valued weights that pose challenges when targeting real-time execution on tightly resource-constrained devices. We propose methods for each of these approaches that transform real-valued weights into binary numbers for efficient inference. Our first method, based on sparse bipolar random projection, projects a large number of real-valued Riemannian covariance features into a binary space, where a linear SVM classifier can likewise be learned with binary weights. By tuning the dimension of the binary embedding, we achieve almost the same accuracy in 4-class MI ($\leq$1.27% lower) compared to models with float16 weights, while delivering a more compact model with simpler operations to execute. Second, we propose to use memory-augmented neural networks (MANNs) for MI-BCI such that the augmented memory is binarized. Our method replaces the fully connected layer of CNNs with a binary augmented memory using either a bipolar random projection or a learned projection. Our experimental results on EEGNet, an already compact CNN for MI-BCI, show that it can be compressed by 1.28x at iso-accuracy using the random projection. On the other hand, using the learned projection provides 3.89% higher accuracy but increases the memory size by 28.10x.
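To illustrate the first method, the sketch below shows how a sparse bipolar random projection can map real-valued features to a binary embedding. This is a minimal illustration, not the paper's implementation: the density of nonzero entries, the embedding dimension, and the helper names (`sparse_bipolar_projection`, `binarize`) are assumptions chosen for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

def sparse_bipolar_projection(d_in, d_out, density=0.1):
    """Sparse bipolar random matrix: each entry is +1 or -1 with total
    probability `density`, and 0 otherwise (hypothetical parameter choice)."""
    mask = rng.random((d_out, d_in)) < density
    signs = rng.choice([-1.0, 1.0], size=(d_out, d_in))
    return mask * signs

def binarize(x, P):
    """Project real-valued features and keep only the sign bits,
    yielding a binary vector (one bit per output dimension)."""
    return (P @ x) >= 0

# Toy usage: project 256 real-valued covariance features
# into a 1024-bit binary embedding.
P = sparse_bipolar_projection(256, 1024)
x = rng.standard_normal(256)
b = binarize(x, P)
print(b.shape, b.dtype)
```

A linear SVM trained on such binary embeddings can itself be reduced to binary weights, so inference needs only bitwise operations and popcounts rather than floating-point multiply-accumulates.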