During collaboration in XR (eXtended Reality), users typically share and interact with virtual objects in a common virtual environment. In Mixed Reality (MR) specifically, collaboration requires knowing each user's position and movement, as well as understanding the visual scene of their physical surroundings; otherwise, one user could move an important virtual object to a position blocked by another user's physical environment. However, even for a single physical environment, 3D reconstruction is slow and the resulting 3D data is typically very large. Streaming these large amounts of 3D data to receivers also takes a long time, making real-time updates of the rendered scene challenging. Furthermore, many MR collaboration systems require multiple devices, which occupy space and complicate setup. To address these challenges, in this paper we describe a single-device system called Collaborative Adaptive Mixed Reality Environment (CAMRE). We build CAMRE on the scene-understanding capabilities of the HoloLens 2 to create a shared MR virtual environment for each connected user, and, using a Leader-Follower(s) paradigm, we demonstrate faster reconstruction and scene-update times owing to the smaller data size. Consequently, multiple users can receive a shared, synchronized virtual scene with close-to-real-time latency from a chosen Leader, based on the Leader's physical position and movement. We also illustrate additional features of the CAMRE virtual environment, such as navigation using a real-time virtual mini-map and X-ray vision based on adaptive wall opacity. We present several experimental results that evaluate CAMRE's performance, including the network latency of sharing virtual objects and other capabilities.