In this study, an adaptive, object-deformability-agnostic human-robot collaborative transportation framework is presented. The proposed framework combines the haptic information transferred through the object with human kinematic information obtained from a motion capture system to generate reactive whole-body motions on a mobile collaborative robot. Furthermore, it allows objects to be rotated intuitively and accurately during co-transportation, based on an algorithm that detects the human's rotation intention from torso and hand movements. First, we validate the framework at the two extremes of the object deformability range (i.e., a purely rigid aluminum rod and a highly deformable rope) using a mobile manipulator that consists of an omnidirectional mobile base and a collaborative robotic arm. Next, its performance is compared with that of an admittance controller during a co-carry task with a partially deformable object in a 12-subject user study. Quantitative and qualitative results of this experiment show that the proposed framework can effectively handle the transportation of objects regardless of their deformability and provides intuitive assistance to human partners. Finally, we demonstrate the potential of our framework in a different scenario, in which the human and the robot co-transport a manikin using a deformable sheet.