Low latency and accuracy are fundamental requirements when vision is integrated in robots for high-speed interaction with targets, since they affect system reliability and stability. In such a scenario, the choice of sensor and algorithms is important for the entire control loop. Event-camera technology can guarantee fast visual sensing in dynamic environments, but it requires a tracking algorithm that can keep up with the high data rate induced by the robot's ego-motion while maintaining accuracy and robustness to distractors. In this paper, we introduce a novel tracking method that leverages the Exponential Reduced Ordinal Surface (EROS) data representation to decouple event-by-event processing from tracking computation. The latter is performed using convolution kernels to detect and follow a circular target moving on a plane. To benchmark state-of-the-art event-based tracking, we propose the task of tracking an air hockey puck sliding on a surface, with the future aim of controlling the iCub robot to reach the target precisely and on time. Experimental results demonstrate that our algorithm achieves the best compromise between low latency and tracking accuracy, both when the robot is stationary and when it is moving.
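A minimal sketch of the idea of decoupling event-by-event surface updates from convolution-based circle detection is given below. It is illustrative only: the local decay rule, kernel shape, parameter values, and the helper names (update_eros, circular_kernel, detect_circle) are assumptions for the sketch, not the paper's implementation.

```python
# Illustrative sketch, NOT the authors' implementation: an EROS-like decaying
# event surface updated per event, and convolution with a ring kernel to
# locate a circular target (e.g., the puck). Parameter values are assumed.
import numpy as np
from scipy.signal import fftconvolve


def update_eros(surface, x, y, k=7, decay=0.7):
    """Simplified EROS-style update: decay a k x k neighbourhood around the
    new event, then set the event pixel to full intensity."""
    half = k // 2
    y0, y1 = max(0, y - half), min(surface.shape[0], y + half + 1)
    x0, x1 = max(0, x - half), min(surface.shape[1], x + half + 1)
    surface[y0:y1, x0:x1] *= decay
    surface[y, x] = 1.0


def circular_kernel(radius):
    """Ring-shaped kernel roughly matching the target's contour."""
    yy, xx = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    dist = np.sqrt(xx ** 2 + yy ** 2)
    return (np.abs(dist - radius) < 1.0).astype(np.float32)


def detect_circle(surface, radius):
    """Return the pixel with the strongest circular response."""
    response = fftconvolve(surface, circular_kernel(radius), mode="same")
    y, x = np.unravel_index(np.argmax(response), response.shape)
    return x, y


if __name__ == "__main__":
    # Synthetic events on the rim of a target centred at (120, 90), radius
    # 15 px, on a 240x180 sensor (all values assumed for illustration).
    rng = np.random.default_rng(0)
    surface = np.zeros((180, 240), dtype=np.float32)
    cx, cy, r = 120, 90, 15
    for t in rng.uniform(0.0, 2.0 * np.pi, 200):
        update_eros(surface, int(cx + r * np.cos(t)), int(cy + r * np.sin(t)))
    print(detect_circle(surface, r))  # expected to be close to (120, 90)
```

The split mirrors the abstract's design choice: the surface update is cheap and runs per event, while the (heavier) convolution-based detection can run at its own rate on the latest surface.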