In this paper, we present a sharp analysis of a class of alternating projected gradient descent algorithms used to solve the covariate-adjusted precision matrix estimation problem in the high-dimensional setting. We show that these algorithms not only enjoy a linear rate of convergence in the absence of convexity, but also attain the optimal statistical rate (i.e., the minimax rate). By introducing the generic chaining technique, our analysis removes the impractical resampling assumption made in previous work. Moreover, our results reveal a time-data tradeoff in this covariate-adjusted precision matrix estimation problem. Numerical experiments are provided to verify our theoretical results.