Recent years have seen considerable progress in the field of Anomaly Detection, but at the cost of increasingly complex training pipelines. Such techniques require large amounts of training data, resulting in computationally expensive algorithms. We propose Few Shot anomaly detection (FewSOME), a deep One-Class Anomaly Detection algorithm that can accurately detect anomalies after training on only a 'few' examples of the normal class and no examples of the anomalous class. FewSOME is of low complexity given its low data requirement and short training time. It is aided by pretrained weights and has an architecture based on Siamese Networks. By means of an ablation study, we demonstrate how our proposed loss, 'Stop Loss', improves the robustness of FewSOME. Our experiments demonstrate that FewSOME performs at state-of-the-art level on the benchmark datasets MNIST, CIFAR-10, F-MNIST and MVTec AD while training on only 30 normal samples, a minute fraction of the data that existing methods are trained on. Most notably, we found that FewSOME outperforms even highly complex models in the setting where only a few examples of the normal class exist. Moreover, our extensive experiments show FewSOME to be robust to contaminated datasets. We also report F1 score and Balanced Accuracy in addition to AUC as a benchmark for future techniques to be compared against.
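To make the one-class, few-shot setting concrete, the following is a minimal illustrative sketch (not the paper's implementation): given embeddings of a small set of normal training samples from a pretrained encoder, a test point is scored by its distance to those normal references, so points far from the normal set receive high anomaly scores. The encoder is simulated here with random vectors; the function name `anomaly_score` and all numeric values are assumptions for illustration only.

```python
import numpy as np

def anomaly_score(x_emb, ref_embs):
    # Score = mean Euclidean distance from the query embedding to the
    # 'few' normal reference embeddings; larger means more anomalous.
    return float(np.mean(np.linalg.norm(ref_embs - x_emb, axis=1)))

rng = np.random.default_rng(0)

# 30 "normal" embeddings clustered near the origin, standing in for the
# outputs of a frozen pretrained encoder on 30 normal training samples.
normal_refs = rng.normal(0.0, 0.1, size=(30, 64))

# A query drawn from the same region (normal) and one far away (anomalous).
normal_query = rng.normal(0.0, 0.1, size=64)
anomalous_query = rng.normal(2.0, 0.1, size=64)

s_norm = anomaly_score(normal_query, normal_refs)
s_anom = anomaly_score(anomalous_query, normal_refs)
assert s_anom > s_norm  # the anomaly lies farther from the normal set
```

This distance-to-normal-references scoring captures why such a method needs only a handful of normal samples and no anomalous ones: the decision is relative to the normal set alone.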