We address the problem of sufficient dimension reduction for feature matrices, which arises frequently in sensor network localization, brain neuroimaging, and electroencephalography analysis. In general, feature matrices admit both row-wise and column-wise interpretations and carry structural information that is lost under naive vectorization. To address this, we propose the principal support matrix machine (PSMM) for matrix sufficient dimension reduction. The PSMM converts the sufficient dimension reduction problem into a series of classification problems by dividing the response variable into slices. It exploits the matrix structure by finding separating hyperplanes whose normal matrices have rank 1, chosen to optimally separate the sliced responses. We further extend the approach to higher-order tensors. Our numerical studies demonstrate that the PSMM outperforms existing methods and offers strong interpretability in real data applications.
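To make the slice-then-classify idea concrete, the sketch below illustrates the vector-valued precursor of this strategy (in the spirit of the principal support vector machine): the continuous response is cut into slices, one linear hinge-loss classifier is fit per slice boundary, and the leading principal directions of the stacked normal vectors form the reduction subspace. This is a simplified illustration only; the function name, the slicing scheme, and the plain subgradient solver are all our own choices, and the actual PSMM additionally imposes the rank-1 structure on the normal matrix, which is not implemented here.

```python
import numpy as np

def psvm_directions(X, y, n_slices=5, n_dirs=2, lam=0.1, lr=0.01, n_iter=500):
    """Illustrative slice-based dimension reduction (not the full PSMM).

    Slice y at its quantiles, fit one linear hinge-loss ("above the cut
    vs. below") classifier per interior cut point, then return the top
    right singular vectors of the stacked classifier normals.
    """
    n, p = X.shape
    Xc = X - X.mean(axis=0)                     # center the features
    edges = np.quantile(y, np.linspace(0, 1, n_slices + 1))
    normals = []
    for h in range(1, n_slices):                # one classifier per cut
        z = np.where(y > edges[h], 1.0, -1.0)   # labels for this cut
        w = np.zeros(p)
        for _ in range(n_iter):                 # subgradient descent on
            margins = z * (Xc @ w)              # regularized hinge loss
            mask = margins < 1
            if mask.any():
                grad = lam * w - (z[mask, None] * Xc[mask]).mean(axis=0)
            else:
                grad = lam * w
            w -= lr * grad
        normals.append(w)
    M = np.vstack(normals)
    # principal directions of the collected normals span the reduction space
    _, _, Vt = np.linalg.svd(M, full_matrices=False)
    return Vt[:n_dirs].T                        # (p, n_dirs) basis matrix
```

On data where the response depends on a single linear index of the features, the leading recovered direction should align closely with that index; the matrix and tensor versions replace the normal vector `w` with a structured (rank-1) normal matrix.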