Event sensing is a major component of bio-inspired flight guidance and control systems. We explore the use of event cameras for predicting time-to-contact (TTC) with the surface during ventral landing. This is achieved by estimating divergence (inverse TTC), i.e., the rate of radial optic flow, from the event stream generated during landing. Our core contributions are a novel contrast maximisation formulation for event-based divergence estimation, and a branch-and-bound algorithm to exactly maximise contrast and find the optimal divergence value. GPU acceleration is employed to speed up the global algorithm. Another contribution is a new dataset containing real event streams from ventral landings, which we use to test and benchmark our method. Owing to the global optimisation, our algorithm is far more capable of recovering the true divergence than heuristic divergence estimators or event-based optic flow methods. With GPU acceleration, our method also achieves competitive runtimes.
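To make the contrast-maximisation idea concrete, the sketch below shows a minimal, illustrative version of event-based divergence estimation under a purely radial (divergence-only) flow model centred on the image midpoint: events are warped back to a reference time for a candidate divergence value, accumulated into an image, and the candidate that maximises the image contrast (variance) is returned, with TTC recovered as its inverse. The function names and the grid search over candidates are our own illustrative assumptions; the paper's method replaces the grid search with an exact branch-and-bound optimiser and GPU acceleration.

```python
import numpy as np

def warp_events(xs, ys, ts, D, t_ref, cx, cy):
    """Warp event coordinates to a reference time under a radial
    flow model x(t) = x(t_ref) * exp(D * (t - t_ref)) centred on (cx, cy)."""
    s = np.exp(-D * (ts - t_ref))          # undo the radial expansion
    xw = cx + (xs - cx) * s
    yw = cy + (ys - cy) * s
    return xw, yw

def contrast(xs, ys, H, W):
    """Accumulate warped events into an image and return its variance
    (the contrast objective)."""
    img, _, _ = np.histogram2d(ys, xs, bins=[H, W], range=[[0, H], [0, W]])
    return img.var()

def estimate_divergence(xs, ys, ts, H, W, candidates):
    """Illustrative grid search over candidate divergence values."""
    cx, cy = W / 2.0, H / 2.0
    t_ref = ts.min()
    best_D, best_c = 0.0, -np.inf
    for D in candidates:
        xw, yw = warp_events(xs, ys, ts, D, t_ref, cx, cy)
        c = contrast(xw, yw, H, W)
        if c > best_c:
            best_D, best_c = D, c
    return best_D                          # TTC estimate is 1.0 / best_D
```

A sharper divergence estimate yields a sharper image of warped events, which is why contrast serves as the objective; the branch-and-bound algorithm in the paper guarantees the globally optimal divergence rather than the best value on a fixed candidate grid.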