Electroencephalography (EEG) has been shown to be a valuable data source for evaluating subjects' mental states. However, the interpretation of multi-modal EEG signals is challenging: they suffer from a poor signal-to-noise ratio, are highly subject-dependent, and are bound to the equipment and experimental setup used (i.e., the domain). As a result, machine learning models often generalize poorly, performing significantly worse on real-world data than on their training data. Recent research focuses heavily on cross-subject and cross-session transfer learning frameworks to reduce domain calibration efforts for EEG signals. We argue that multi-source learning, i.e., learning domain-invariant representations from multiple data sources, is a viable alternative, as the amount of available data from different EEG data-source domains (e.g., subjects, sessions, experimental setups) grows massively. We propose an adversarial inference approach to learn data-source-invariant representations in this context, enabling multi-source learning for EEG-based brain-computer interfaces. We unify EEG recordings from different source domains (i.e., the emotion recognition datasets SEED, SEED-IV, DEAP, and DREAMER) and demonstrate the feasibility of our invariant representation learning approach, which suppresses data-source-relevant information leakage by 35% while still achieving stable EEG-based emotion classification performance.
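Adversarial invariant representation learning of this kind is commonly realized with a gradient reversal layer (GRL) between a shared feature encoder and a domain discriminator, as in domain-adversarial training. The sketch below illustrates the GRL mechanics only; the class and parameter names are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

class GradReverse:
    """Gradient reversal layer (GRL): identity in the forward pass,
    gradient scaled by -lam in the backward pass. Placed between a
    shared EEG encoder and a domain discriminator, it makes the encoder
    maximize the discriminator's loss, pushing it toward features from
    which the data source (subject, session, dataset) cannot be read."""

    def __init__(self, lam: float = 1.0):
        self.lam = lam  # trade-off between task loss and domain confusion

    def forward(self, x: np.ndarray) -> np.ndarray:
        return x  # features pass through unchanged

    def backward(self, grad_out: np.ndarray) -> np.ndarray:
        return -self.lam * grad_out  # reversed gradient reaches the encoder


grl = GradReverse(lam=0.5)
feats = np.array([1.0, -2.0])               # hypothetical encoder features
print(grl.forward(feats))                   # unchanged: [ 1. -2.]
print(grl.backward(np.array([0.2, 0.4])))   # reversed:  [-0.1 -0.2]
```

In such a setup the discriminator is trained to predict the source domain from the shared features, while the reversed gradient trains the encoder to defeat it; the task classifier's gradient flows through unchanged, so task-relevant structure can be preserved while data-source leakage is suppressed.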