This paper proposes new algorithms for the metric learning problem. We start by noticing that several classical metric learning formulations from the literature can be viewed as modified covariance matrix estimation problems. Leveraging this point of view, a general approach, called Robust Geometric Metric Learning (RGML), is then studied. This method aims at simultaneously estimating the covariance matrix of each class while shrinking them towards their (unknown) barycenter. We focus on two specific cost functions: one associated with the Gaussian likelihood (RGML Gaussian), and one with Tyler's M-estimator (RGML Tyler). In both, the barycenter is defined with the Riemannian distance, which enjoys nice properties of geodesic convexity and affine invariance. The optimization is performed using the Riemannian geometry of symmetric positive definite matrices and its submanifold of unit-determinant matrices. Finally, the performance of RGML is assessed on real datasets. It exhibits strong performance while being robust to mislabeled data.
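As a point of reference for the distance mentioned above, the following is a minimal, illustrative sketch (not the paper's implementation) of the affine-invariant Riemannian distance between symmetric positive definite matrices; the function name and the toy matrices are hypothetical and chosen only for the example.

```python
# Illustrative sketch, assuming the affine-invariant Riemannian distance
# d(A, B) = ||log(A^{-1/2} B A^{-1/2})||_F on SPD matrices.
# Not the RGML code; names below are hypothetical.
import numpy as np
from scipy.linalg import sqrtm, logm

def riemannian_distance(A, B):
    """Affine-invariant distance between SPD matrices A and B."""
    A_inv_sqrt = np.linalg.inv(np.real(sqrtm(A)))
    M = A_inv_sqrt @ B @ A_inv_sqrt
    return np.linalg.norm(np.real(logm(M)), "fro")

# Toy usage with two SPD matrices built from random data.
rng = np.random.default_rng(0)
X = rng.standard_normal((10, 3))
Y = rng.standard_normal((10, 3))
A = X.T @ X / 10 + np.eye(3)
B = Y.T @ Y / 10 + np.eye(3)
print(riemannian_distance(A, B))
```

This distance is invariant under congruence transformations, d(A, B) = d(C A Cᵀ, C B Cᵀ) for any invertible C, which is the affine-invariance property the abstract refers to.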