Existing work on functional data classification focuses on constructing classifiers that achieve perfect classification, in the sense that the classification risk converges to zero asymptotically. In practical applications, perfect classification is often impossible because the optimal Bayes classifier may have asymptotically nonzero risk; this phenomenon is called imperfect classification. For Gaussian functional data, we study the classification problem in the imperfect classification scenario. Sharp convergence rates for the minimax excess risk are derived when the data functions are either fully observed or discretely observed. We propose easily implementable classifiers based on discriminant analysis and prove that they achieve minimax optimality. In the discretely observed case, we discover a critical sampling frequency that governs the sharp convergence rates. The proposed classifiers perform favorably in finite samples, as we demonstrate through comparisons with other functional classifiers in simulations and a real data application.
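To make the discriminant-analysis idea concrete, the following is a minimal sketch of one way such a classifier can be implemented for discretely observed Gaussian functional data: each curve is projected onto the leading eigenvectors of the pooled sample covariance, and a quadratic Gaussian likelihood-ratio rule is applied to the resulting scores. The truncation level `J`, the equal-prior assumption, and the common observation grid are illustrative choices, and this sketch is not necessarily the exact classifier proposed in the paper.

```python
# Sketch of a discriminant-analysis classifier for discretely observed functional data.
# Assumptions (not from the abstract): curves share a common grid, equal class priors,
# and J leading components of the pooled covariance are retained.
import numpy as np


def fit_functional_qda(X0, X1, J=5):
    """X0, X1: arrays of shape (n_k, m) with curves from class 0 / class 1
    observed on a common grid of m points. Returns a classifier closure."""
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    # Pooled covariance of the centred curves; its eigenvectors play the role
    # of discretised eigenfunctions.
    centred = np.vstack([X0 - mu0, X1 - mu1])
    cov = centred.T @ centred / centred.shape[0]
    _, eigvecs = np.linalg.eigh(cov)
    basis = eigvecs[:, ::-1][:, :J]           # leading J components

    # Per-class Gaussian parameters of the projection scores.
    S0, S1 = X0 @ basis, X1 @ basis
    m0, m1 = S0.mean(axis=0), S1.mean(axis=0)
    C0 = np.cov(S0, rowvar=False) + 1e-8 * np.eye(J)
    C1 = np.cov(S1, rowvar=False) + 1e-8 * np.eye(J)

    def log_gauss(s, m, C):
        d = s - m
        _, logdet = np.linalg.slogdet(C)
        return -0.5 * (logdet + d @ np.linalg.solve(C, d))

    def classify(x):
        """Assign a new curve x (length-m array) to the class whose Gaussian
        log-likelihood of the scores is larger (equal priors assumed)."""
        s = x @ basis
        return int(log_gauss(s, m1, C1) > log_gauss(s, m0, C0))

    return classify


# Illustrative usage on synthetic curves observed at 50 grid points.
rng = np.random.default_rng(0)
grid = np.linspace(0, 1, 50)
X0 = rng.standard_normal((100, 50)).cumsum(axis=1) / 7
X1 = 0.5 * np.sin(2 * np.pi * grid) + rng.standard_normal((100, 50)).cumsum(axis=1) / 7
clf = fit_functional_qda(X0, X1, J=5)
print(clf(X1[0]))  # label 1 expected for most class-1 curves
```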