Traditional frame-based cameras inevitably suffer from motion blur due to long exposure times. As a kind of bio-inspired camera, the event camera records intensity changes asynchronously with high temporal resolution, providing valid image degradation information within the exposure time. In this paper, we rethink the event-based image deblurring problem and unfold it into an end-to-end two-stage image restoration network. To effectively fuse event and image features, we design an event-image cross-modal attention module applied at multiple levels of our network, which allows the network to focus on relevant features from the event branch and filter out noise. We also introduce a novel symmetric cumulative event representation specifically for image deblurring, as well as an event mask gated connection between the two stages of our network, which helps avoid information loss. At the dataset level, to foster event-based motion deblurring and to facilitate evaluation on challenging real-world images, we introduce the Real Event Blur (REBlur) dataset, captured with an event camera in an illumination-controlled optical laboratory. Our Event Fusion Network (EFNet) sets a new state of the art in motion deblurring, surpassing both the prior best-performing image-based method and all event-based methods with public implementations on the GoPro dataset (by up to 2.47 dB) and on our REBlur dataset, even in extreme blurry conditions. The code and our REBlur dataset will be made publicly available.
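To make the fusion idea concrete, below is a minimal sketch of one plausible form of event-image cross-modal attention, assuming a standard multi-head attention formulation in PyTorch. The class name `EventImageCrossAttention`, the channel and head sizes, and the choice of drawing queries from the image branch and keys/values from the event branch are illustrative assumptions, not the exact EICA module described in the paper.

```python
# Hedged sketch: event features guide image features via cross attention.
# All names and hyperparameters here are hypothetical placeholders.
import torch
import torch.nn as nn


class EventImageCrossAttention(nn.Module):
    """Fuse event-branch features into image-branch features (illustrative)."""

    def __init__(self, channels: int = 64, num_heads: int = 4):
        super().__init__()
        self.norm_img = nn.LayerNorm(channels)
        self.norm_evt = nn.LayerNorm(channels)
        # Queries from the image branch; keys/values from the event branch,
        # so image features can attend to relevant event information and
        # down-weight noisy events.
        self.attn = nn.MultiheadAttention(channels, num_heads, batch_first=True)
        self.proj = nn.Linear(channels, channels)

    def forward(self, img_feat: torch.Tensor, evt_feat: torch.Tensor) -> torch.Tensor:
        # img_feat, evt_feat: (B, C, H, W) feature maps from the two branches.
        b, c, h, w = img_feat.shape
        q = self.norm_img(img_feat.flatten(2).transpose(1, 2))   # (B, HW, C)
        kv = self.norm_evt(evt_feat.flatten(2).transpose(1, 2))  # (B, HW, C)
        fused, _ = self.attn(q, kv, kv)                          # cross-modal attention
        fused = self.proj(fused).transpose(1, 2).reshape(b, c, h, w)
        return img_feat + fused                                  # residual fusion


if __name__ == "__main__":
    block = EventImageCrossAttention(channels=64, num_heads=4)
    img = torch.randn(1, 64, 32, 32)   # image-branch features
    evt = torch.randn(1, 64, 32, 32)   # event-branch features (e.g. from an event representation)
    print(block(img, evt).shape)       # torch.Size([1, 64, 32, 32])
```

The residual connection keeps the image features intact when the attended event signal is uninformative; this is a common design choice for fusion blocks, stated here only as a sketch-level assumption.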