Surround-view cameras are a primary sensor for automated driving, used for near-field perception. They are among the most commonly used sensors in commercial vehicles. Four fisheye cameras with a 190° field of view cover the 360° around the vehicle. Due to their high radial distortion, standard algorithms do not extend easily to them. Previously, we released the first public fisheye surround-view dataset, named WoodScape. In this work, we release a synthetic version of the surround-view dataset, addressing many of its weaknesses and extending it. Firstly, it is not possible to obtain ground truth for pixel-wise optical flow and depth in a real-world dataset. Secondly, WoodScape did not have all four cameras annotated simultaneously, in order to sample diverse frames; as a consequence, multi-camera algorithms could not be designed, which the new dataset enables. We implemented the surround-view fisheye geometric projections in the CARLA simulator matching WoodScape's configuration and created SynWoodScape. We release 80k images from the synthetic dataset with annotations for 10+ tasks. We also release the baseline code and supporting scripts.
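To make the fisheye geometric projection mentioned above concrete, the sketch below shows a minimal pixel projection under a polynomial radial-distortion model, where the image radius is a polynomial in the angle of incidence from the optical axis. This is an illustrative assumption only: the function name `project_fisheye`, the coefficient values, and the principal point are hypothetical and are not taken from the WoodScape or SynWoodScape calibration files.

```python
import numpy as np

def project_fisheye(points_cam, poly_coeffs, cx, cy):
    """Project 3D points (camera frame, Z forward) onto a fisheye image.

    Assumes a polynomial radial model rho(theta) = k1*theta + k2*theta^2 + ...,
    where theta is the angle between the ray and the optical axis.
    poly_coeffs, cx, cy are hypothetical calibration values for illustration.
    """
    X, Y, Z = points_cam[:, 0], points_cam[:, 1], points_cam[:, 2]
    chi = np.sqrt(X**2 + Y**2)          # distance of the ray from the optical axis
    theta = np.arctan2(chi, Z)          # angle of incidence
    # Image radius in pixels as a polynomial in theta
    rho = sum(k * theta**(i + 1) for i, k in enumerate(poly_coeffs))
    # Scale the normalized (X, Y) direction by the radius and shift to the principal point
    u = cx + rho * X / np.maximum(chi, 1e-9)
    v = cy + rho * Y / np.maximum(chi, 1e-9)
    return np.stack([u, v], axis=1)

# Example with made-up coefficients for a wide (~190°) lens on a 1280x966 sensor
coeffs = [340.0, 10.0, -5.0, 0.5]
pts = np.array([[1.0, 0.5, 2.0], [-2.0, 0.0, 1.0]])
print(project_fisheye(pts, coeffs, cx=640.0, cy=483.0))
```

In the simulator, a projection of this kind can be applied to rendered depth and geometry so that the synthetic images and their pixel-wise ground truth share the same fisheye mapping as the real camera configuration.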