Constructing colorized point clouds from mobile laser scanning and images is fundamental work in surveying and mapping, and an essential prerequisite for building digital twins of smart cities. However, existing public datasets are either relatively small in scale or lack accurate geometric and color ground truth. This paper documents a multisensorial dataset named PolyU-BPCoMA that is distinctively positioned towards mobile colorized mapping. The dataset incorporates 3D LiDAR, spherical imaging, GNSS, and IMU resources on a backpack platform. Color checkerboards are pasted in each surveyed area as targets, and ground truth data are collected by an advanced terrestrial laser scanner (TLS). 3D geometric and color information can be recovered from the colorized point clouds produced by the backpack system and by the TLS. Accordingly, the dataset provides an opportunity to benchmark the mapping accuracy and the colorization accuracy of a mobile multisensorial system simultaneously. The dataset is approximately 800 GB in size and covers both indoor and outdoor environments. The dataset and development kits are available at https://github.com/chenpengxin/PolyU-BPCoMa.git.
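To illustrate the kind of benchmarking the dataset enables, the following is a minimal sketch of how colorization accuracy could be evaluated against the TLS reference. It is not the authors' evaluation protocol or part of the released development kits; the file names, array layout (N x 6 arrays of XYZ + RGB in a common frame), and the distance threshold are all assumptions made for illustration.

```python
# Minimal sketch (assumptions: both clouds are (N, 6) arrays [x, y, z, r, g, b]
# already registered in a common frame; file names and max_dist are hypothetical).
import numpy as np
from scipy.spatial import cKDTree

def colorization_rmse(backpack_cloud, tls_cloud, max_dist=0.05):
    """Compare backpack colors against TLS ground-truth colors via nearest neighbors.

    backpack_cloud, tls_cloud: (N, 6) arrays [x, y, z, r, g, b], RGB in [0, 255].
    max_dist: reject matches farther than this (meters) to avoid false correspondences.
    """
    # Build a KD-tree on the TLS ground-truth geometry.
    tree = cKDTree(tls_cloud[:, :3])
    # Find the nearest TLS point for every backpack point.
    dist, idx = tree.query(backpack_cloud[:, :3], k=1)
    valid = dist <= max_dist
    # RMSE over the RGB channels of the matched pairs.
    diff = backpack_cloud[valid, 3:6] - tls_cloud[idx[valid], 3:6]
    return float(np.sqrt(np.mean(diff ** 2)))

# Example usage with hypothetical file names:
# bp = np.load("backpack_colorized.npy")   # (N, 6)
# tls = np.load("tls_ground_truth.npy")    # (M, 6)
# print("Color RMSE:", colorization_rmse(bp, tls))
```

A geometric (mapping) accuracy metric could be obtained analogously from the nearest-neighbor distances themselves, under the same registration assumption.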