Covariance matrices of noisy multichannel electroencephalogram time series data are hard to estimate due to their high dimensionality. In brain-computer interfaces (BCI) based on event-related potentials and a linear discriminant analysis (LDA) for classification, the state-of-the-art approach to this problem is shrinkage regularization. We propose a novel idea to tackle this problem by enforcing a block-Toeplitz structure for the covariance matrix of the LDA, which implements an assumption of signal stationarity in short time windows for each channel. On data of 213 subjects collected under 13 event-related potential BCI protocols, the resulting 'ToeplitzLDA' significantly increases the binary classification performance compared to shrinkage-regularized LDA (up to 6 AUC points) and Riemannian classification approaches (up to 2 AUC points). This translates to greatly improved application-level performance, as exemplified on data recorded during an unsupervised visual speller application, where spelling errors could be reduced by 81% on average for 25 subjects. Aside from lower memory and time complexity for LDA training, ToeplitzLDA proved to be almost invariant even to a twenty-fold enlargement of the time dimensionality, which reduces the need for expert knowledge regarding feature extraction.
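To make the core idea concrete, the following is a minimal NumPy sketch of a block-Toeplitz covariance estimate combined with binary LDA: the empirical covariance of time-major feature vectors (all channels at time 0, then time 1, ...) is averaged over all channel-by-channel blocks that share the same time lag, which encodes the short-term stationarity assumption. The function names `block_toeplitz_cov` and `lda_weights`, the feature ordering, and the small ridge term are illustrative assumptions and not the authors' implementation, which additionally exploits the Toeplitz structure for efficient computation.

```python
import numpy as np

def block_toeplitz_cov(X, n_channels, n_times):
    """Estimate a block-Toeplitz covariance from epochs X of shape
    (n_epochs, n_times * n_channels), features ordered time-major.
    Channel-by-channel blocks sharing the same time lag are averaged,
    implementing stationarity within the epoch (illustrative sketch)."""
    S = np.cov(X, rowvar=False)          # full empirical covariance
    C, T = n_channels, n_times
    # average the C x C blocks along each block diagonal (same lag)
    lag_blocks = {}
    for lag in range(T):
        blocks = [S[(i + lag) * C:(i + lag + 1) * C, i * C:(i + 1) * C]
                  for i in range(T - lag)]
        lag_blocks[lag] = np.mean(blocks, axis=0)
    # rebuild the covariance from the lag-averaged blocks
    S_toep = np.zeros_like(S)
    for i in range(T):
        for j in range(T):
            lag = i - j
            B = lag_blocks[abs(lag)]
            S_toep[i * C:(i + 1) * C, j * C:(j + 1) * C] = B if lag >= 0 else B.T
    return S_toep

def lda_weights(X0, X1, n_channels, n_times, ridge=1e-6):
    """Binary LDA weights using the block-Toeplitz pooled covariance.
    X0, X1: epochs of the two classes, shape (n_epochs, n_times * n_channels)."""
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    Xc = np.vstack([X0 - mu0, X1 - mu1])          # class-wise centered data
    S = block_toeplitz_cov(Xc, n_channels, n_times)
    S += ridge * np.eye(S.shape[0])               # assumed ridge for numerical stability
    w = np.linalg.solve(S, mu1 - mu0)
    b = -0.5 * w @ (mu0 + mu1)
    return w, b                                    # decision: sign(X @ w + b)
```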