Anomalies are ubiquitous in all scientific fields and can express an unexpected event due to incomplete knowledge about the data distribution or an unknown process that suddenly comes into play and distorts the observations. Due to such events' rarity, it is common to train deep learning models on "normal", i.e. non-anomalous, datasets only, thus letting the neural network model the distribution underlying the input data. In this context, we propose our deep learning approach to the anomaly detection problem named Multi-Layer One-Class Classification (MOCCA). We explicitly leverage the piece-wise nature of deep neural networks by exploiting information extracted at different depths to detect abnormal data instances. We show how combining the representations extracted from multiple layers of a model leads to higher discrimination performance than typical approaches proposed in the literature, which are based on the neural network's final output only. We propose to train the model by minimizing the $L_2$ distance between the input representation and a reference point, the anomaly-free training data centroid, at each considered layer. We conduct extensive experiments on publicly available datasets for anomaly detection, namely CIFAR10, MVTec AD, and ShanghaiTech, considering both the single-image and video-based scenarios. We show that our method achieves superior performance compared to the state-of-the-art approaches available in the literature. Moreover, we provide a model analysis to give insight into how our approach works.
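To make the training objective concrete, the following is a minimal sketch of the multi-layer loss implied by the description above; the notation ($\phi_\ell$ for the representation of input $x$ at layer $\ell$, $c_\ell$ for the corresponding centroid, and $\mathcal{D}$ for the set of considered layers) is illustrative rather than the paper's own:
$$
\mathcal{L}(x) \;=\; \sum_{\ell \in \mathcal{D}} \big\lVert \phi_\ell(x) - c_\ell \big\rVert_2^2,
\qquad
c_\ell \;=\; \frac{1}{|X_{\mathrm{train}}|} \sum_{x' \in X_{\mathrm{train}}} \phi_\ell(x'),
$$
where $X_{\mathrm{train}}$ contains only anomaly-free samples, so each $c_\ell$ is the centroid of the normal training data at that depth; under this sketch, a large summed distance at test time would flag an input as anomalous.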