This article proposes a novel framework that uses an over-the-air Brain-Computer Interface (BCI) to learn Metaverse users' expectations. By interpreting users' brain activities, our framework can optimize physical resources and enhance the Quality-of-Experience (QoE) for users. To achieve this, we leverage a Wireless Edge Server (WES) to process electroencephalography (EEG) signals received over uplink wireless channels, thereby offloading the computational burden from Metaverse users' devices. As a result, the WES can learn human behaviors, adapt system configurations, and allocate radio resources to tailor personalized user settings. Despite the potential of BCI, the inherently noisy wireless channels and the uncertainty of EEG signals make the associated resource allocation and learning problems especially challenging. We formulate the joint learning and resource allocation problem as a mixed-integer programming problem and solve it with two algorithms: a hybrid learning algorithm and a meta-learning algorithm. The hybrid learning algorithm effectively solves the formulated problem, while the meta-learning algorithm further exploits the neurodiversity of EEG signals across multiple users to achieve higher classification accuracy. Extensive simulation results with real-world BCI datasets demonstrate that our framework achieves low latency and high EEG signal classification accuracy.
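To make the cross-user meta-learning idea concrete, the sketch below shows a minimal Reptile-style meta-update over per-user EEG classification batches: a shared initialization is adapted to each user and then nudged toward the average of the adapted weights. This is only an illustrative assumption of how neurodiversity across users could be exploited, not the paper's actual algorithm; all names (EEGClassifier, user_batches, meta_step) and dimensions are hypothetical placeholders.

```python
# Illustrative Reptile-style meta-update over per-user EEG batches.
# Hypothetical sketch only: EEGClassifier, user_batches, and all sizes
# are placeholders, not the paper's model, data, or algorithm.
import copy
import torch
import torch.nn as nn

class EEGClassifier(nn.Module):
    """Toy classifier mapping flattened EEG features to class logits."""
    def __init__(self, n_features=64, n_classes=4):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, 128), nn.ReLU(),
            nn.Linear(128, n_classes),
        )

    def forward(self, x):
        return self.net(x)

def meta_step(meta_model, user_batches, inner_steps=5, inner_lr=1e-2, meta_lr=0.1):
    """One meta-iteration: adapt a copy of the shared model to each user,
    then move the shared weights toward the mean of the adapted weights."""
    loss_fn = nn.CrossEntropyLoss()
    meta_state = {k: v.clone() for k, v in meta_model.state_dict().items()}
    adapted_states = []

    for x, y in user_batches:               # one (features, labels) batch per user
        model = copy.deepcopy(meta_model)   # start from the shared initialization
        opt = torch.optim.SGD(model.parameters(), lr=inner_lr)
        for _ in range(inner_steps):        # user-specific adaptation
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()
        adapted_states.append(model.state_dict())

    # Meta-update: interpolate shared weights toward the mean adapted weights.
    with torch.no_grad():
        new_state = {}
        for k in meta_state:
            mean_adapted = torch.stack([s[k].float() for s in adapted_states]).mean(0)
            new_state[k] = meta_state[k] + meta_lr * (mean_adapted - meta_state[k])
        meta_model.load_state_dict(new_state)

# Example: two synthetic "users" with random EEG feature batches.
if __name__ == "__main__":
    torch.manual_seed(0)
    model = EEGClassifier()
    users = [(torch.randn(32, 64), torch.randint(0, 4, (32,))) for _ in range(2)]
    meta_step(model, users)
```

A first-order update of this kind avoids second-order gradients, which keeps per-iteration cost low; whether that matches the latency budget discussed in the article would depend on the authors' actual algorithm and system setup.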