The advantages of event sensing over conventional sensors (e.g., higher dynamic range, lower latency, and lower power consumption) have spurred research into machine learning for event data. Unsurprisingly, deep learning has emerged as a competitive methodology for learning with event sensors; in typical setups, discrete and asynchronous events are first converted into frame-like tensors on which standard deep networks can be applied. However, overfitting remains a challenge, particularly since event datasets remain small relative to conventional datasets (e.g., ImageNet). In this paper, we introduce EventDrop, a new method for augmenting asynchronous event data to improve the generalization of deep models. By dropping events selected with various strategies, we are able to increase the diversity of training data (e.g., to simulate various levels of occlusion). From a practical perspective, EventDrop is simple to implement and computationally low-cost. Experiments on two event datasets (N-Caltech101 and N-Cars) demonstrate that EventDrop can significantly improve the generalization performance across a variety of deep networks.
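To make the core idea concrete, below is a minimal sketch of two plausible event-dropping strategies consistent with the abstract: randomly dropping a fraction of events, and dropping all events inside a random spatial region to simulate occlusion. The event layout `(x, y, t, p)`, the function names, and the parameter choices are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def random_drop(events, drop_ratio=0.1, rng=None):
    """Randomly drop a fraction of events.

    events: (N, 4) array with columns (x, y, t, p) -- an assumed layout.
    """
    rng = np.random.default_rng() if rng is None else rng
    keep = rng.random(len(events)) >= drop_ratio
    return events[keep]

def drop_by_area(events, width, height, area_ratio=0.2, rng=None):
    """Drop events inside a random rectangle (simulating partial occlusion)."""
    rng = np.random.default_rng() if rng is None else rng
    w = max(1, int(width * area_ratio))
    h = max(1, int(height * area_ratio))
    x0 = rng.integers(0, max(1, width - w))
    y0 = rng.integers(0, max(1, height - h))
    x, y = events[:, 0], events[:, 1]
    inside = (x >= x0) & (x < x0 + w) & (y >= y0) & (y < y0 + h)
    return events[~inside]
```

Either transform would be applied per training sample before the event stream is converted into a frame-like tensor, so the downstream network is unchanged.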