Partial least squares regression (PLSR) is a widely used statistical model that reveals linear relationships between latent factors derived from the independent and dependent variables. However, traditional methods \ql{for solving PLSR models are usually formulated in Euclidean space and easily get} stuck in a local minimum. To this end, we propose a new method for solving partial least squares regression, named PLSR via optimization on the bi-Grassmann manifold (PLSRbiGr). \ql{Specifically, we first leverage} a three-factor SVD-type decomposition of the cross-covariance matrix defined on the bi-Grassmann manifold, converting the orthogonality-constrained optimization problem into an unconstrained optimization problem on the bi-Grassmann manifold, and then incorporate Riemannian preconditioning with matrix scaling to regulate the Riemannian metric at each iteration. \ql{PLSRbiGr is validated} with a variety of experiments for decoding EEG signals in motor imagery (MI) and steady-state visual evoked potential (SSVEP) tasks. Experimental results demonstrate that PLSRbiGr outperforms competing algorithms in multiple EEG decoding tasks, which will greatly facilitate learning from small sample data.
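As a reading aid, the formulation described above can be sketched as follows; the notation and the exact form of the preconditioned metric are assumptions rather than taken from the paper. Writing $C_{xy} \in \mathbb{R}^{p \times q}$ for the cross-covariance matrix between the independent and dependent variables, a three-factor SVD-type decomposition amounts to
\begin{equation*}
\min_{(U,V) \in \mathrm{Gr}(p,k) \times \mathrm{Gr}(q,k),\; S \in \mathbb{R}^{k \times k}} \; \lVert C_{xy} - U S V^{\top} \rVert_F^2,
\end{equation*}
so that the orthogonality constraints $U^{\top}U = V^{\top}V = I_k$ are absorbed into the bi-Grassmann search space rather than imposed explicitly. One common matrix-scaling preconditioner for factorizations of this form endows the tangent spaces with the metrics
\begin{equation*}
g_U(\xi_U, \eta_U) = \operatorname{tr}\!\left(\xi_U^{\top} \eta_U \, S S^{\top}\right), \qquad
g_V(\xi_V, \eta_V) = \operatorname{tr}\!\left(\xi_V^{\top} \eta_V \, S^{\top} S\right),
\end{equation*}
which rescales the Riemannian gradient at each iteration; the specific preconditioner used by PLSRbiGr may differ.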