Given a convex and differentiable objective $Q(\M)$ for a real symmetric matrix $\M$ in the positive definite (PD) cone -- used to compute Mahalanobis distances -- we propose a fast, general metric learning framework that is entirely projection-free. We first assume that $\M$ resides in a space $\cS$ of generalized graph Laplacian matrices corresponding to balanced signed graphs. A matrix $\M \in \cS$ that is also PD is called a graph metric matrix. Unlike the low-rank metric matrices common in the literature, $\cS$ includes the important diagonal-only matrices as a special case. The key theorem that circumvents full eigen-decomposition and enables fast metric matrix optimization is Gershgorin disc perfect alignment (GDPA): given $\M \in \cS$ and a diagonal matrix $\S$ with $S_{ii} = 1/v_i$, where $\v$ is the first eigenvector of $\M$ (corresponding to $\lambda_{\min}$), we prove that the Gershgorin disc left-ends of the similarity transform $\B = \S \M \S^{-1}$ are perfectly aligned at the smallest eigenvalue $\lambda_{\min}$. Using this theorem, we replace the PD cone constraint in the metric learning problem with the tightest possible linear constraints per iteration, so that the alternating optimization of the diagonal and off-diagonal terms in $\M$ can be solved efficiently as linear programs via the Frank-Wolfe method. We update $\v$ using Locally Optimal Block Preconditioned Conjugate Gradient (LOBPCG) with warm start as entries in $\M$ are optimized successively. Experiments show that our graph metric optimization is significantly faster than cone-projection schemes and produces competitive binary classification performance.
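The GDPA claim can be checked numerically on a toy instance. The sketch below (an illustration under our own assumptions, not the paper's code) builds a generalized graph Laplacian $\M$ for a small positive-edge graph -- a balanced signed graph with all-positive edges -- forms $\B = \S \M \S^{-1}$ with $S_{ii} = 1/v_i$, and verifies that every Gershgorin disc left-end of $\B$ equals $\lambda_{\min}$:

```python
import numpy as np

# Adjacency of a small connected graph with positive edge weights;
# the resulting balanced signed graph has all-positive edges.
W = np.array([[0., 2., 1., 0.],
              [2., 0., 3., 1.],
              [1., 3., 0., 2.],
              [0., 1., 2., 0.]])
L = np.diag(W.sum(axis=1)) - W   # combinatorial graph Laplacian (PSD)
M = L + 0.5 * np.eye(4)          # add self-loops so M is PD

# First eigenvector v (smallest eigenvalue). For a connected
# positive-edge graph its entries are strictly nonzero, so 1/v_i
# is well defined; flip the sign so that v > 0 entrywise.
lam, V = np.linalg.eigh(M)
lam_min, v = lam[0], V[:, 0]
v = v if v[0] > 0 else -v

S = np.diag(1.0 / v)
B = S @ M @ np.linalg.inv(S)     # similarity transform: same spectrum as M

# Gershgorin disc left-end of row i: B_ii - sum_{j != i} |B_ij|.
off = B - np.diag(np.diag(B))
left_ends = np.diag(B) - np.sum(np.abs(off), axis=1)
print(np.allclose(left_ends, lam_min))  # True: all discs aligned at lambda_min
```

The alignment is what lets the PD constraint $\lambda_{\min}(\M) \geq \rho > 0$ be enforced through per-row linear constraints on the disc left-ends, since for aligned discs the smallest left-end equals $\lambda_{\min}$ exactly rather than merely lower-bounding it.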