A novel feature selection model via orthogonal canonical correlation analysis with $(2,1)$-norm regularization is proposed, and the model is solved by a practical NEPv (nonlinear eigenvalue problem with eigenvector dependency) approach, yielding a feature selection method named OCCA-FS. It is proved that OCCA-FS always produces a sequence of approximations with monotonic objective values and is globally convergent. Extensive numerical experiments compare OCCA-FS against existing feature selection methods. The results demonstrate that OCCA-FS delivers superior classification performance and often comes out on top among all the methods compared.
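For readers unfamiliar with the regularizer named above: the $(2,1)$-norm of a projection matrix $W$ is the sum of the Euclidean norms of its rows, which drives entire rows toward zero and thereby discards the corresponding features. A minimal illustrative sketch (NumPy; not the paper's implementation):

```python
import numpy as np

def norm_21(W):
    """(2,1)-norm: sum of the Euclidean norms of the rows of W.
    Rows driven to (near) zero by the regularizer mark features to drop."""
    return float(np.sum(np.linalg.norm(W, axis=1)))

# Hypothetical 3-feature, 2-dimensional projection matrix.
W = np.array([[3.0, 4.0],   # row norm 5 -> feature kept
              [0.0, 0.0],   # row norm 0 -> feature pruned
              [1.0, 0.0]])  # row norm 1 -> feature kept
print(norm_21(W))  # 6.0
```

Ranking features by row norm of the learned $W$ is the standard selection rule for $(2,1)$-regularized models.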