The sophisticated sense of touch of the human hand significantly contributes to our ability to safely, efficiently, and dexterously manipulate arbitrary objects in our environment. Robotic and prosthetic devices lack refined tactile feedback from their end-effectors, leading to counterintuitive and complex control strategies. To address this shortcoming, tactile sensors have been designed and developed, but they often offer insufficient spatial and temporal resolution. This paper focuses on overcoming these issues by designing a smart embedded system, called SmartHand, that enables the acquisition and real-time processing of high-resolution tactile information from a hand-shaped multi-sensor array for prosthetic and robotic applications. We acquire a new tactile dataset consisting of 340,000 frames collected while interacting with 16 everyday objects, plus the empty hand, i.e., a total of 17 classes. The design of the embedded system minimizes response latency in classification by deploying a small yet accurate convolutional neural network on a high-performance ARM Cortex-M7 microcontroller. Compared to related work, our model requires an order of magnitude less memory and 15.6x fewer computations while achieving similar inter-session accuracy and up to 98.86% and 99.83% top-1 and top-3 cross-validation accuracy, respectively. Experimental results show a total power consumption of 505 mW and a latency of only 100 ms.
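To give a feel for the memory and compute budget such an embedded deployment implies, the sketch below counts parameters and multiply-accumulate operations (MACs) for a hypothetical compact CNN. The layer shapes, the 32x32 single-channel input, and the helper names are illustrative assumptions, not the actual SmartHand architecture; only the 17-class output matches the abstract.

```python
# Hypothetical compact CNN for 17-class tactile-frame classification.
# All layer shapes below are assumptions for illustration only.

def conv_cost(in_ch, out_ch, k, out_h, out_w):
    """Parameters and MACs of a k x k convolution producing out_h x out_w maps."""
    params = out_ch * (in_ch * k * k + 1)          # weights + biases
    macs = out_ch * in_ch * k * k * out_h * out_w  # one MAC per weight per output pixel
    return params, macs

def dense_cost(in_f, out_f):
    """Parameters and MACs of a fully connected layer."""
    params = out_f * (in_f + 1)                    # weights + biases
    macs = out_f * in_f
    return params, macs

# Assumed input: a 32x32 single-channel pressure map.
layers = [
    conv_cost(1, 8, 3, 30, 30),    # conv1: 3x3, valid padding -> 30x30x8
    conv_cost(8, 16, 3, 13, 13),   # conv2 after 2x2 pooling (15x15 in) -> 13x13x16
    dense_cost(16 * 6 * 6, 17),    # classifier over 6x6 pooled maps -> 17 classes
]

total_params = sum(p for p, _ in layers)
total_macs = sum(m for _, m in layers)
print(f"params={total_params}, MACs={total_macs}")
```

With 8-bit weights, a model of this size would occupy roughly 11 kB of flash, which illustrates why such a network fits comfortably on a Cortex-M7-class microcontroller.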