Physical human-robot interaction (pHRI) is less efficient and communicative than human-human interaction, and a key reason is the lack of an informative sense of touch in robotic systems. Interpreting human touch gestures is a nuanced and challenging task, with a wide gap between human and robot capability. Among prior works that demonstrate human touch recognition, differences in sensors, gesture classes, feature sets, and classification algorithms yield a patchwork of non-transferable results and a glaring lack of standardization. To address this gap, this work presents 1) four proposed touch gesture classes that cover the majority of the gesture characteristics identified in the literature, 2) an extensive force dataset collected on a common pHRI robotic arm using only its internal wrist force-torque sensor, and 3) an exhaustive performance comparison of feature set and classification algorithm combinations on this dataset. We demonstrate high classification accuracy on a held-out test set across our proposed gesture classes, and find that neural network classifiers operating on the raw data outperform all other combinations of feature sets and algorithms.
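As a hedged illustration of the kind of comparison described above, and not the paper's actual pipeline, the sketch below contrasts one plausible hand-crafted feature set paired with a classical classifier against a neural network trained on raw force-torque windows. The window length, channel layout, and placeholder data are assumptions invented for this example; only the four-class setup comes from the abstract.

```python
# Minimal sketch: hand-crafted features + SVM vs. a neural network on raw
# force-torque windows. Shapes and data are hypothetical stand-ins.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

N_CLASSES, N_AXES, WINDOW = 4, 6, 200  # 4 gesture classes; assumed 6-axis F/T, 200-sample windows

rng = np.random.default_rng(0)
# Placeholder arrays standing in for recorded gesture windows: (samples, axes, time).
X = rng.normal(size=(400, N_AXES, WINDOW))
y = rng.integers(0, N_CLASSES, size=400)

def summary_features(windows: np.ndarray) -> np.ndarray:
    """Per-axis summary statistics as one plausible hand-crafted feature set."""
    return np.concatenate(
        [windows.mean(-1), windows.std(-1), windows.min(-1), windows.max(-1)],
        axis=1,
    )

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Baseline: hand-crafted features fed to a classical classifier (SVM).
svm = make_pipeline(StandardScaler(), SVC())
svm.fit(summary_features(X_tr), y_tr)
print("features+SVM accuracy:", svm.score(summary_features(X_te), y_te))

# Neural network classifier on the raw (flattened) force-torque windows.
mlp = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(128, 64), max_iter=500, random_state=0),
)
mlp.fit(X_tr.reshape(len(X_tr), -1), y_tr)
print("raw+MLP accuracy:", mlp.score(X_te.reshape(len(X_te), -1), y_te))
```

With the random placeholder data both models score near chance; the comparison only becomes meaningful once real gesture windows are substituted in.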