An orthonormal basis matrix $X$ of a subspace ${\cal X}$ is not unique unless additional normalization requirements are imposed. One such requirement is that $X^{\rm T}D$ be positive semi-definite, where $D$ is a constant matrix of appropriate size. This requirement arises naturally in multi-view subspace learning models, in which $X$ serves as a projection matrix and is determined by a maximization problem over the Stiefel manifold whose objective function contains, and increases with, tr$(X^{\rm T}D)$. This paper is concerned with bounding the change in the orthonormal basis matrix $X$ as the subspace ${\cal X}$ varies, under the requirement that $X^{\rm T}D$ stay positive semi-definite. The results are useful in the convergence analysis of the NEPv approach (nonlinear eigenvalue problem with eigenvector dependency) for solving the maximization problem.
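As a concrete illustration of the normalization described above (a hedged sketch, not taken from the paper itself): among all orthonormal bases $XQ$ of the same subspace, with $Q$ orthogonal, the one making $(XQ)^{\rm T}D$ positive semi-definite is obtained by taking $Q$ to be the orthogonal polar factor of $X^{\rm T}D$, which also maximizes tr$((XQ)^{\rm T}D)$. All variable names (`n`, `k`, `rng`, etc.) below are illustrative.

```python
import numpy as np

# Illustrative sketch: normalize an orthonormal basis X of a subspace so that
# X^T D becomes positive semi-definite, using the orthogonal polar factor of X^T D.
rng = np.random.default_rng(0)
n, k = 6, 3
X, _ = np.linalg.qr(rng.standard_normal((n, k)))  # orthonormal basis of a subspace
D = rng.standard_normal((n, k))                   # constant matrix of matching size

# SVD of X^T D: X^T D = U diag(s) V^T; the orthogonal polar factor is Q = U V^T.
U, s, Vt = np.linalg.svd(X.T @ D)
Q = U @ Vt           # orthogonal polar factor of X^T D
X_new = X @ Q        # same subspace, rotated orthonormal basis

M = X_new.T @ D      # equals V diag(s) V^T: symmetric positive semi-definite,
                     # and tr(M) = sum of singular values, the maximal trace value
```

Here `X_new` spans the same subspace as `X`, but its normalization is now pinned down by the positive semi-definiteness of `X_new.T @ D`.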