Human-robot interactions are less efficient and communicative than human-to-human interactions, and a key reason is the lack of an informed sense of touch in robotic systems. Existing literature demonstrates robot success in executing handovers with humans, albeit with substantial reliance on external sensing or on primitive signal-processing methods that are deficient compared to the rich set of information humans can detect. In contrast, we present models capable of distinguishing between four classes of human tactile gestures at a robot's end effector, using only a non-collocated six-axis force sensor at the wrist. Because no such resources exist in the literature, this work describes 1) the collection of an extensive force dataset characterized by human-robot contact events, and 2) classification models informed by this dataset to determine the nature of the interaction. We demonstrate high classification accuracies among our proposed gesture definitions on a test set, emphasizing that neural network classifiers on the raw data outperform several other combinations of algorithms and feature sets.
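To make the classification setup concrete, below is a minimal sketch (not the authors' implementation) of a neural-network classifier operating on raw six-axis force/torque windows. Only the six-channel input and the four gesture classes come from the abstract; the window length, convolutional architecture, and all layer sizes are illustrative assumptions.

```python
# Minimal sketch: 1-D CNN over raw wrist force/torque windows.
# Assumptions (not from the paper): 200-sample windows, this specific
# architecture, and PyTorch as the framework.
import torch
import torch.nn as nn


class ForceGestureNet(nn.Module):
    """Classifier over raw six-axis force/torque signals.

    Input:  (batch, 6, window_len) -- Fx, Fy, Fz, Tx, Ty, Tz samples.
    Output: (batch, 4) class logits, one per tactile-gesture class.
    """

    def __init__(self, n_classes: int = 4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(6, 32, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # collapse the time axis to one feature vector
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).squeeze(-1))


if __name__ == "__main__":
    model = ForceGestureNet()
    # Dummy batch: 8 windows of 200 six-axis samples each.
    logits = model(torch.randn(8, 6, 200))
    print(logits.shape)  # torch.Size([8, 4])
```

Operating on raw force windows, as in this sketch, mirrors the abstract's claim that networks trained on raw data outperformed pipelines built on hand-engineered feature sets.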