We define a new second-order retraction map for statistical models. We also compute retractions using homotopy continuation. Riemannian optimization uses local methods to solve optimization problems whose constraint set is a smooth manifold. A linear step along a descent direction usually leaves the constraint set, and hence retraction maps are used to approximate the exponential map and return to the manifold. For many common matrix manifolds, retraction maps are available, with more or less explicit formulas. For other implicitly-defined manifolds or varieties, suitable retraction maps are difficult to compute. We therefore develop Algorithm 1, which uses homotopy continuation to compute the Euclidean distance retraction for any implicitly-defined submanifold of $\mathbb{R}^n$. We also consider statistical models as Riemannian submanifolds of the probability simplex with the Fisher metric. After defining an analogous maximum likelihood retraction, Algorithm 2 computes it using homotopy continuation. In Theorem 2, we prove that the resulting map is a second-order retraction; with the Levi-Civita connection associated to the Fisher metric, it approximates geodesics to second-order accuracy.
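As a point of contrast with the general implicit case treated by Algorithm 1, here is a minimal sketch (not the paper's algorithm) of a case where the Euclidean distance retraction has a closed form: on the unit sphere $\{x : \|x\|^2 = 1\}$, metric projection of $x + v$ back onto the manifold is simply normalization. The function name is illustrative only.

```python
import numpy as np

def euclidean_retraction_sphere(x, v):
    """Euclidean distance retraction on the unit sphere:
    metric projection of x + v, which here is just normalization.
    For a general implicitly-defined variety {f(x) = 0}, no such
    closed form exists; that is the gap homotopy continuation fills."""
    y = x + v
    return y / np.linalg.norm(y)

x = np.array([1.0, 0.0, 0.0])   # a point on the sphere
v = np.array([0.0, 0.2, -0.1])  # a tangent vector at x (x @ v == 0)
y = euclidean_retraction_sphere(x, v)
print(np.linalg.norm(y))        # the retracted point lies on the sphere
```

Note the two defining retraction properties are visible here: $R_x(0) = x$, and the result always satisfies the constraint exactly rather than to first order.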