One-class classification (OCC) is the problem of deciding whether an observed sample belongs to a target class or not. We consider the problem of learning an OCC model when the dataset available at the learning stage contains only samples from the target class. We aim to obtain a classifier that performs as the generalized likelihood ratio test (GLRT), which is a well-known and provably optimal (under specific assumptions) classifier when the statistics of the target class are available. To this end, we consider both the multilayer perceptron neural network (NN) and the support vector machine (SVM) models. They are trained as two-class classifiers using an artificial dataset for the alternative class, obtained by generating random samples uniformly over the domain of the target-class dataset. We prove that, under suitable assumptions, the models converge (with a large dataset) to the GLRT. Moreover, we show that the one-class least squares SVM (OCLSSVM) at convergence performs as the GLRT, with a suitable transformation function. Lastly, we compare the obtained solutions with the autoencoder (AE) classifier, which in general does not provide the GLRT.
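The training scheme described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: it assumes scikit-learn's `MLPClassifier`, a synthetic Gaussian stand-in for the target-class dataset, and an axis-aligned bounding box as the "domain" over which the artificial alternative-class samples are drawn uniformly.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Target-class training data: a 2-D Gaussian cluster (a synthetic
# stand-in; the paper does not prescribe a particular distribution).
X_target = rng.normal(loc=0.0, scale=1.0, size=(500, 2))

# Artificial alternative class: samples drawn uniformly over the
# (here: axis-aligned bounding box of the) target-class data domain.
lo, hi = X_target.min(axis=0), X_target.max(axis=0)
X_uniform = rng.uniform(lo, hi, size=(500, 2))

# Train a two-class MLP: label 1 = target class, label 0 = artificial class.
X = np.vstack([X_target, X_uniform])
y = np.concatenate([np.ones(500), np.zeros(500)])
clf = MLPClassifier(hidden_layer_sizes=(32, 32), max_iter=2000,
                    random_state=0).fit(X, y)

# At test time, thresholding the network output yields a one-class
# detector: samples near the target distribution score high, samples
# far from it score low (approximating a likelihood-ratio statistic).
score_in = clf.predict_proba(np.array([[0.0, 0.0]]))[0, 1]
score_out = clf.predict_proba(np.array([[3.0, 3.0]]))[0, 1]
```

Because the alternative class is uniform over the domain, the trained posterior is a monotone function of the target-class density there, which is why thresholding it mimics the (G)LRT in the large-sample limit.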