Riemannian optimization uses local methods to solve optimization problems whose constraint set is a smooth manifold. A linear step along some descent direction usually leaves the constraint set, and hence retraction maps are used to approximate the exponential map and return to the manifold. For many common matrix manifolds, retraction maps are available, with more or less explicit formulas. For implicitly-defined manifolds, suitable retraction maps are difficult to compute. We therefore develop an algorithm which uses homotopy continuation to compute the Euclidean distance retraction for any implicitly-defined submanifold of R^n, and prove convergence results. We also consider statistical models as Riemannian submanifolds of the probability simplex with the Fisher metric. Replacing Euclidean distance minimization with maximum likelihood estimation yields a map which we prove is a retraction. In fact, we prove that this retraction is second order: with respect to the Levi-Civita connection associated to the Fisher metric, it approximates geodesics to second-order accuracy.
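As a brief illustrative sketch (standard definitions, not taken verbatim from the paper): a retraction on a submanifold M ⊂ R^n is a smooth map R from the tangent bundle of M to M satisfying, for each point x ∈ M and tangent vector v,

\[
R_x(0) = x, \qquad \left.\frac{d}{dt}\right|_{t=0} R_x(tv) = v,
\]

and the Euclidean distance retraction referred to above can be written as the metric projection

\[
R_x(v) \;=\; \operatorname*{arg\,min}_{y \in M} \; \|x + v - y\|_2,
\]

which is well defined for all sufficiently small v. A retraction is called second order when, in addition, the curve t ↦ R_x(tv) has zero intrinsic acceleration at t = 0, so that it agrees with the geodesic through x with initial velocity v up to second order.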