Accessing the Metaverse through VR headsets is a rapidly growing trend, but the high cost of entry currently puts it out of reach for many users. This project aims to provide an accessible entry point to an immersive Metaverse experience by leveraging web technologies. The platform allows users to interact through rendered avatars using only a web browser, microphone, and webcam. Using WebGL and Google's MediaPipe face-tracking AI model, the application generates real-time 3D face meshes of users. A client-to-client streaming cluster establishes connections, after which clients negotiate SRTP sessions through WebRTC for direct data streaming. The project also addresses backend challenges with a serverless, distributed, auto-scaling, highly resilient, and secure architecture. The result is a scalable solution, requiring no dedicated hardware, that lets users experience a near-immersive Metaverse, with the potential for future integration with game-server clusters. This project is an important step toward a Metaverse that is inclusive and accessible to a wider audience.
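The connection setup summarized above (a signaling step that lets two clients exchange session descriptions before WebRTC opens a direct SRTP stream) can be illustrated with a minimal sketch. All names here (`SignalingRelay`, `SignalMessage`) are hypothetical, not part of the project's actual codebase; in practice the relay would run over WebSockets and the payloads would be real SDP offers/answers and ICE candidates.

```typescript
// Hypothetical sketch: an in-memory relay that forwards signaling
// messages between two peers so they can negotiate a direct
// WebRTC (SRTP) connection. Illustrative only.

type SignalMessage = {
  from: string;
  to: string;
  kind: "offer" | "answer" | "ice-candidate";
  payload: string; // SDP or serialized ICE candidate
};

class SignalingRelay {
  // Each connected client registers a handler for incoming messages.
  private clients = new Map<string, (msg: SignalMessage) => void>();

  register(id: string, onMessage: (msg: SignalMessage) => void): void {
    this.clients.set(id, onMessage);
  }

  // Forward a message to its addressee; false if the peer is unknown.
  relay(msg: SignalMessage): boolean {
    const handler = this.clients.get(msg.to);
    if (!handler) return false;
    handler(msg);
    return true;
  }
}

// Usage: Alice sends an offer; Bob replies with an answer.
const relay = new SignalingRelay();
const inboxAlice: SignalMessage[] = [];
const inboxBob: SignalMessage[] = [];

relay.register("alice", (m) => inboxAlice.push(m));
relay.register("bob", (m) => inboxBob.push(m));

relay.relay({ from: "alice", to: "bob", kind: "offer", payload: "v=0 ..." });
relay.relay({ from: "bob", to: "alice", kind: "answer", payload: "v=0 ..." });

console.log(inboxBob[0].kind, inboxAlice[0].kind); // offer answer
```

Once the offer/answer exchange completes, the peers stream audio, video, and face-mesh data directly to each other over SRTP, so the relay never touches media traffic — this is what keeps the backend lightweight and serverless-friendly.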