Collaborative robotic industrial cells are workspaces where robots collaborate with human operators. In this context, safety is paramount, which requires a complete perception of the space in which the collaborative robot operates. To ensure this, collaborative cells are equipped with a large set of sensors of multiple modalities, covering the entire work volume. However, fusing the information from all these sensors requires an accurate extrinsic calibration. Calibrating such complex systems is challenging, due to the number of sensors and modalities, and also due to the small overlap between the fields of view of the sensors, which are positioned to capture different viewpoints of the cell. This paper proposes a sensor-to-pattern methodology that can calibrate a complex system such as a collaborative cell in a single optimization procedure. Our methodology can handle RGB cameras, depth cameras, and LiDARs. Results show that our methodology is able to accurately calibrate a collaborative cell containing three RGB cameras, a depth camera, and three 3D LiDARs.