Event-based sensors have recently drawn increasing interest in robotic perception due to their lower latency, higher dynamic range, and lower bandwidth requirements compared to standard CMOS-based imagers. These properties make them ideal for real-time perception tasks in highly dynamic environments. In this work, we demonstrate an application where event cameras excel: accurately estimating the impact location of fast-moving objects. We introduce a lightweight event representation called the Binary Event History Image (BEHI) to encode event data at low latency, as well as a learning-based approach that enables real-time inference of a confidence-enabled control signal for the robot. To validate our approach, we present an experimental catching system that catches fast-flying ping-pong balls. We show that the system achieves a success rate of 81% in catching balls aimed at different target locations and flying at speeds of up to 13 m/s, even on compute-constrained embedded platforms such as the Nvidia Jetson NX.
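The abstract names the BEHI representation without defining it. As a rough illustration only, the following sketch shows one plausible interpretation: a binary image in which a pixel is set if at least one event fired at that location within a recent history window, discarding polarity and event counts to keep the encoding lightweight. The function name, the `(x, y, t)` event layout, and the window length are assumptions for this example, not the paper's actual specification.

```python
import numpy as np

def binary_event_history_image(events, height, width, t_now, history_window):
    """Sketch of a Binary Event History Image (BEHI)-style encoding:
    a single-channel binary image where a pixel is 1 if at least one
    event fired there within the history window, regardless of polarity
    or how many events occurred. Details are illustrative assumptions.
    """
    behi = np.zeros((height, width), dtype=np.uint8)
    for x, y, t in events:
        # Keep only events inside the trailing history window.
        if t_now - history_window <= t <= t_now:
            behi[y, x] = 1
    return behi

# Toy example: three events on a 4x4 sensor; the first falls outside
# the trailing 8 ms window ending at t_now = 10 ms.
events = [(0, 0, 0.001), (2, 3, 0.004), (1, 1, 0.0095)]
img = binary_event_history_image(events, height=4, width=4,
                                 t_now=0.010, history_window=0.008)
print(int(img.sum()))  # prints 2: two events land inside the window
```

Because the image is binary and polarity-free, it can be built incrementally with a single OR per event, which is consistent with the abstract's emphasis on low-latency encoding on embedded hardware.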