Sufficient dimension reduction (SDR) using distance covariance (DCOV) was recently proposed as an approach to dimension-reduction problems. Compared with other SDR methods, it is model-free, requiring no estimation of the link function, and it imposes no particular distributional assumptions on the predictors (see Sheng and Yin, 2013, 2016). However, the DCOV-based SDR method involves optimizing a nonsmooth and nonconvex objective function over the Stiefel manifold. To tackle this numerical challenge, we equivalently reformulate the original objective function as a DC (difference of convex functions) program and construct an iterative algorithm based on the majorization-minimization (MM) principle. At each step of the MM algorithm, we inexactly solve the quadratic subproblem on the Stiefel manifold by taking one iteration of Riemannian Newton's method. The algorithm can also be readily extended to sufficient variable selection (SVS) using distance covariance. We establish the convergence property of the proposed algorithm under some regularity conditions. Simulation studies show that, compared with the existing method, our algorithm drastically improves computational efficiency and is robust across various settings. Supplemental materials for this article are available.
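To illustrate the DC/MM principle mentioned above, here is a minimal sketch on a toy one-dimensional problem (not the DCOV objective and not over the Stiefel manifold): minimize f(x) = g(x) - h(x), where g(x) = x² is convex and smooth and h(x) = |x| is convex and nonsmooth. Each MM step majorizes f by linearizing the concave part -h at the current iterate and then minimizes the resulting convex surrogate in closed form. The function names and starting value are illustrative assumptions, not from the paper.

```python
# Toy DC / MM (DCA) iteration: minimize f(x) = x**2 - |x|.
# At each step, linearize h(x) = |x| at the current iterate x_k
# using a subgradient s, giving the convex surrogate
#   q(x) = x**2 - s*x + const,
# which majorizes f (up to a constant) and is minimized at x = s/2.

def dca_step(x):
    # Subgradient of h(x) = |x|; take 0 at the kink x = 0.
    s = 1.0 if x > 0 else (-1.0 if x < 0 else 0.0)
    # Closed-form minimizer of the convex surrogate.
    return s / 2.0

x = 2.0  # arbitrary starting point (illustrative)
for _ in range(10):
    x = dca_step(x)

print(x)  # converges to 0.5, a stationary point: 2x - 1 = 0 on x > 0
```

In the paper's setting the surrogate is instead a quadratic over the Stiefel manifold, so its minimization is itself nontrivial, which is why each MM step is solved only inexactly with one Riemannian Newton iteration.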