In this paper we present our hardware design and control approaches for a mobile manipulation platform used in Challenge 2 of the MBZIRC 2020 competition. In this challenge, a team of UAVs and a single UGV collaborate in an autonomous wall-building scenario, motivated by construction automation and large-scale robotic 3D printing. The robots must autonomously detect, manipulate, and transport bricks in an unstructured, outdoor environment. Our control approach is based on a state machine that dictates which controllers are active at each stage of the Challenge. In the first stage, our UGV uses visual servoing and local controllers to approach the target object without considering its orientation. The second stage consists of detecting the object's global pose using OpenCV-based processing of RGB-D image and point-cloud data, and calculating an alignment goal within a global map. The map is built with Google Cartographer from onboard LIDAR, IMU, and GPS data. Motion control in the second stage is realized using the ROS Move Base package with Timed Elastic Band trajectory optimization. Visual servo algorithms guide the vehicle in local object-approach movement and the arm in manipulating bricks. To ensure a stable grasp of the brick's magnetic patch, we developed a passively compliant, electromagnetic gripper with tactile feedback. Our fully autonomous UGV performed well in Challenge 2 and in post-competition evaluations of its brick pick-and-place algorithms.
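The two-stage staging described above can be illustrated with a minimal sketch of a controller-selecting state machine. All class and controller names here are hypothetical illustrations, not the authors' actual implementation:

```python
from enum import Enum, auto

class Stage(Enum):
    APPROACH = auto()  # stage 1: visual servoing, object orientation ignored
    ALIGN = auto()     # stage 2: global-pose alignment via Move Base / TEB

class ControlStateMachine:
    """Hypothetical selector mirroring the two-stage control scheme."""

    def __init__(self):
        self.stage = Stage.APPROACH

    def active_controller(self) -> str:
        # the state machine dictates which controller is active
        if self.stage is Stage.APPROACH:
            return "visual_servo"    # local approach controllers
        return "move_base_teb"       # global alignment with the TEB planner

    def on_global_pose_estimated(self):
        # transition once the object's global pose is known
        self.stage = Stage.ALIGN

sm = ControlStateMachine()
print(sm.active_controller())  # visual_servo
sm.on_global_pose_estimated()
print(sm.active_controller())  # move_base_teb
```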