Abnormality detection is a challenging task due to its dependence on a specific context and the unconstrained variability of practical scenarios. In recent years, it has benefited both from the powerful features learnt by deep neural networks and from handcrafted features specialized for abnormality detectors. However, these highly complex approaches still have limitations in handling long-term sequential data (e.g., videos), and their learnt features do not thoroughly capture useful information. Recurrent Neural Networks (RNNs) have been shown to deal robustly with temporal data in long-term sequences. In this paper, we propose a novel variant of the Gated Recurrent Unit (GRU), called the Single Tunnelled GRU, for abnormality detection. In particular, the Single Tunnelled GRU discards the heavily weighted reset gate of GRU cells, which overlooks the importance of past content by favouring only the current input, to obtain an optimized single-gated cell model. Moreover, we substitute the hyperbolic tangent activation in standard GRUs with sigmoid activation, as the former suffers from performance loss in deeper networks. Empirical results show that our proposed optimized GRU model outperforms standard GRU and Long Short-Term Memory (LSTM) networks on most metrics for detection and generalization tasks on the CUHK Avenue and UCSD datasets. The model is also computationally efficient, with reduced training and testing time over standard RNNs.
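The cell modification described above can be sketched as follows: starting from the standard GRU equations, the reset gate is dropped entirely and the candidate state uses a sigmoid rather than a hyperbolic tangent. This is a minimal NumPy illustration under those two assumptions; the class name, weight names, and initialization are our own and not taken from the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class SingleTunnelledGRUCell:
    """Sketch of a single-gated GRU cell: no reset gate, and a
    sigmoid (not tanh) candidate activation. Names/shapes are
    illustrative assumptions, not the authors' implementation."""

    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        s = 1.0 / np.sqrt(hidden_size)
        # Update gate parameters: z_t = sigmoid(Wz x_t + Uz h_{t-1})
        self.Wz = rng.uniform(-s, s, (hidden_size, input_size))
        self.Uz = rng.uniform(-s, s, (hidden_size, hidden_size))
        # Candidate-state parameters; note the previous state h_{t-1}
        # enters directly, with no reset gate applied to it
        self.Wh = rng.uniform(-s, s, (hidden_size, input_size))
        self.Uh = rng.uniform(-s, s, (hidden_size, hidden_size))

    def step(self, x, h):
        z = sigmoid(self.Wz @ x + self.Uz @ h)       # single (update) gate
        h_cand = sigmoid(self.Wh @ x + self.Uh @ h)  # sigmoid, not tanh
        return (1.0 - z) * h + z * h_cand            # gated blend of old/new

# Roll the cell over a short toy sequence
cell = SingleTunnelledGRUCell(input_size=4, hidden_size=3)
h = np.zeros(3)
for t in range(5):
    h = cell.step(np.ones(4), h)
print(h.shape)  # hidden state stays (3,)
```

Because both the gate and the candidate are sigmoid outputs, every hidden-state component remains in (0, 1), which differs from a tanh-based GRU whose state can span (-1, 1).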