Event cameras are bio-inspired vision sensors that asynchronously measure per-pixel brightness changes. Their high temporal resolution and asynchronicity offer great potential for estimating robot motion states. Recent works have adopted continuous-time estimation methods to exploit the inherent nature of event cameras; however, existing methods either have poor runtime performance or neglect the high temporal resolution of event cameras. To address these limitations, an Asynchronous Event-driven Visual Odometry (AsynEVO) based on sparse Gaussian Process (GP) regression is proposed to efficiently infer the motion trajectory from pure event streams. Concretely, an asynchronous frontend pipeline is designed to perform event-driven feature tracking and manage feature trajectories, and a parallel dynamic sliding-window backend is presented within the framework of sparse GP regression on $SE(3)$. Notably, a dynamic marginalization strategy is employed to ensure the consistency and sparsity of the GP regression. Experiments conducted on public datasets and in real-world scenarios demonstrate that AsynEVO achieves competitive precision and superior robustness compared to the state-of-the-art. The experiment in a repeated-texture scenario indicates that the high temporal resolution of AsynEVO plays a vital role in estimating high-speed motion. Furthermore, we show that the computational efficiency of AsynEVO significantly outperforms the incremental method.