While tremendous advances in visual and auditory realism have been made for virtual and augmented reality (VR/AR), introducing a plausible sense of physicality into the virtual world remains challenging. Closing the gap between real-world physicality and immersive virtual experience requires a closed interaction loop: applying user-exerted physical forces to the virtual environment and generating haptic sensations back to the users. However, existing VR/AR solutions either completely ignore the force inputs from the users or rely on obtrusive sensing devices that compromise user experience. By identifying users' muscle activation patterns while engaging in VR/AR, we design a learning-based neural interface for natural and intuitive force inputs. Specifically, we show that lightweight electromyography sensors, resting non-invasively on users' forearm skin, inform and establish a robust understanding of their complex hand activities. Fuelled by a neural-network-based model, our interface can decode finger-wise forces in real time with 3.3% mean error, and generalize to new users with little calibration. Through an interactive psychophysical study, we show that human perception of virtual objects' physical properties, such as stiffness, can be significantly enhanced by our interface. We further demonstrate that our interface enables ubiquitous control via finger tapping. Ultimately, we envision that our findings will push research towards more realistic physicality in future VR/AR.
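To make the described pipeline concrete, the following is a minimal sketch of an EMG-to-force decoder of the kind the abstract outlines: a small neural network that maps a window of multi-channel forearm surface-EMG samples to per-finger force estimates. The channel count, window length, and architecture here are illustrative assumptions, not the paper's actual model.

```python
# Minimal sketch (assumptions, not the paper's model): a small network that maps a
# window of multi-channel forearm EMG samples to per-finger force estimates.
import torch
import torch.nn as nn

N_CHANNELS = 8      # assumed number of surface EMG electrodes on the forearm
WINDOW = 200        # assumed samples per sliding window (e.g., 100 ms at 2 kHz)
N_FINGERS = 5       # one force estimate per finger

class EMGForceDecoder(nn.Module):
    """Maps a windowed EMG signal (batch, channels, time) to finger-wise forces."""
    def __init__(self):
        super().__init__()
        # Temporal convolutions extract muscle-activation features from the raw window.
        self.features = nn.Sequential(
            nn.Conv1d(N_CHANNELS, 32, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),   # collapse the time axis
        )
        # Regression head outputs one normalized force value per finger.
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64, N_FINGERS),
            nn.Sigmoid(),              # forces normalized to [0, 1]
        )

    def forward(self, emg_window: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(emg_window))

# Example: decode one window of (synthetic) EMG into five finger forces.
decoder = EMGForceDecoder()
fake_emg = torch.randn(1, N_CHANNELS, WINDOW)
print(decoder(fake_emg))   # tensor of shape (1, 5), values in [0, 1]
```

In such a setup, the "little calibration" for new users mentioned in the abstract could plausibly correspond to briefly fine-tuning the regression head on a handful of labeled samples from the new user, though the paper's actual calibration procedure is not specified here.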