Prosody plays a vital role in verbal communication. The acoustic cues of prosody have been examined extensively. However, prosodic characteristics are perceived not only auditorily but also visually, based on head and facial movements. The purpose of this report is to present a method for examining audiovisual prosody using virtual reality. We show that animations based on a virtual human provide motion cues similar to those obtained from video recordings of a real talker. The use of virtual reality opens up new avenues for examining multimodal effects in verbal communication. We discuss the method in the framework of examining prosody perception in cochlear implant listeners.