Physical human-robot interactions (pHRI) are less efficient and communicative than human-human interactions, and a key reason is the lack of an informative sense of touch in robotic systems. Interpreting human touch gestures is a nuanced, challenging task with an extreme gap between human and robot capability. Among prior works that demonstrate human touch recognition capability, differences in sensors, gesture classes, feature sets, and classification algorithms yield a conglomerate of non-transferable results and a glaring lack of a standard. To address this gap, this work presents 1) four proposed touch gesture classes that cover an important subset of the gesture characteristics identified in the literature, 2) the collection of an extensive force dataset on a common pHRI robotic arm using only its internal wrist force-torque sensor, and 3) an exhaustive performance comparison of combinations of feature sets and classification algorithms on this dataset. We demonstrate high classification accuracies on our proposed gesture definitions on a test set, emphasizing that neural network classifiers on the raw data outperform other combinations of feature sets and algorithms. The accompanying video is here: https://youtu.be/gJPVImNKU68
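To make the best-performing combination concrete, below is a minimal sketch of a neural-network classifier operating on raw wrist force-torque windows. Only the six F/T channels and the four gesture classes come from the abstract; the PyTorch framework, the window length, and the 1D-CNN architecture are illustrative assumptions, not the paper's reported model.

```python
# Hedged sketch: a small 1D CNN over raw force-torque time series.
# Assumptions (not from the paper): PyTorch, WINDOW_LEN, and the layer sizes.
import torch
import torch.nn as nn

NUM_CHANNELS = 6   # Fx, Fy, Fz, Tx, Ty, Tz from the internal wrist F/T sensor
NUM_CLASSES = 4    # the four proposed touch gesture classes
WINDOW_LEN = 100   # assumed number of samples per gesture window

class RawFTClassifier(nn.Module):
    """Classifies a raw F/T window into one of the four gesture classes."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(NUM_CHANNELS, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # collapse the time axis to one feature vector
        )
        self.head = nn.Linear(64, NUM_CLASSES)

    def forward(self, x):
        # x: (batch, NUM_CHANNELS, WINDOW_LEN) raw F/T samples, no hand-crafted features
        return self.head(self.features(x).squeeze(-1))

# Usage: score a batch of two raw windows.
model = RawFTClassifier()
logits = model(torch.randn(2, NUM_CHANNELS, WINDOW_LEN))
print(logits.shape)  # torch.Size([2, 4]) -> one logit per gesture class
```

The design choice the abstract highlights is simply that the network consumes the raw sensor stream directly, rather than a pre-computed feature set.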