Anomaly detection aims to separate anomalies from normal samples, and pretrained networks are promising for this task. However, adapting pretrained features risks pattern collapse when fine-tuning on one-class training data. In this paper, we propose an anomaly detection framework called constrained adaptive projection with pretrained features (CAP). On top of the pretrained features, a simple linear projection head applied to a given input and its k most similar pretrained normal representations performs feature adaptation, and a reformed self-attention mines the inner relationships among one-class semantic features. We further propose a loss function that avoids potential pattern collapse: it measures the similarity between a sample and its corresponding adaptive normal representation, and incorporates a constraint term that slightly aligns the pretrained and adaptive spaces. Our method achieves state-of-the-art performance on semantic and sensory anomaly detection benchmarks, including 96.5% AUROC on CIFAR-100, 97.0% AUROC on CIFAR-10, and 89.9% AUROC on MVTec.
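To make the abstract concrete, the following is a minimal, hypothetical PyTorch sketch of the idea as described above: a linear projection head adapts pretrained features, a self-attention layer (used here as a stand-in for the reformed self-attention) relates a sample to its k most similar pretrained normal representations, and the loss combines a similarity term with a constraint that slightly aligns the adapted and pretrained spaces. All class names, tensor shapes, and weightings below are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch of the CAP idea; names and details are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class CAPSketch(nn.Module):
    def __init__(self, dim: int, k: int = 5):
        super().__init__()
        self.k = k
        self.proj = nn.Linear(dim, dim)                       # linear projection head
        self.attn = nn.MultiheadAttention(dim, num_heads=1,   # stand-in for the reformed
                                          batch_first=True)   # self-attention in the paper

    def forward(self, z: torch.Tensor, bank: torch.Tensor):
        # z: (B, D) pretrained features of the inputs; bank: (N, D) pretrained normal features.
        sim = F.normalize(z, dim=-1) @ F.normalize(bank, dim=-1).T   # cosine similarity (B, N)
        idx = sim.topk(self.k, dim=-1).indices                       # k most similar normal samples
        neighbors = bank[idx]                                        # (B, k, D)
        tokens = torch.cat([z.unsqueeze(1), neighbors], dim=1)       # (B, k+1, D)
        adapted = self.proj(tokens)                                  # feature adaptation
        mixed, _ = self.attn(adapted, adapted, adapted)              # relate one-class features
        z_ad = mixed[:, 0]                                           # adapted input feature
        normal_ad = mixed[:, 1:].mean(dim=1)                         # adaptive normal representation
        return z_ad, normal_ad, adapted[:, 0]

    def loss(self, z: torch.Tensor, bank: torch.Tensor, align_weight: float = 0.1):
        z_ad, normal_ad, z_proj = self(z, bank)
        sim_term = 1.0 - F.cosine_similarity(z_ad, normal_ad).mean()  # pull sample toward its
                                                                      # adaptive normal representation
        align_term = F.mse_loss(z_proj, z)                            # slight alignment of adaptive
                                                                      # and pretrained spaces
        return sim_term + align_weight * align_term
```

At test time, under this sketch, the similarity term (without the alignment constraint) could serve directly as the anomaly score: samples far from their adaptive normal representation are flagged as anomalous.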