Building on the prior work of Chi and Chi (2022), the current paper revisits estimation in robust structured regression under the $\text{L}_2\text{E}$ criterion. We adopt the majorization-minimization (MM) principle to design a new algorithm for updating the vector of regression coefficients. Our sharper majorization achieves faster convergence than the earlier alternating proximal gradient descent algorithm (Chi and Chi, 2022). In addition, we reparameterize the model by substituting precision for scale and estimate precision via a modified Newton's method. This substitution simplifies and accelerates overall estimation. We also introduce distance-to-set penalties to permit constrained estimation under nonconvex constraint sets. This tactic further improves performance in coefficient estimation and structure recovery. Finally, we demonstrate the merits of our improvements through a rich set of simulation examples and a real data application.
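To fix ideas, the following is a minimal sketch of the Gaussian $\text{L}_2\text{E}$ criterion for linear regression in the precision parameterization $\tau = 1/\sigma$ mentioned above. The formula is the standard $\text{L}_2$-distance criterion for a Gaussian error model (Scott, 2001); the function name and interface are illustrative, not the paper's actual implementation.

```python
import numpy as np

def l2e_criterion(beta, tau, X, y):
    """Gaussian L2E criterion in the precision parameterization tau = 1/sigma:
        f(beta, tau) = tau/(2*sqrt(pi))
                       - (tau/n)*sqrt(2/pi) * sum_i exp(-tau^2 r_i^2 / 2),
    where r_i = y_i - x_i' beta. Smaller values indicate a better robust fit;
    large residuals are exponentially downweighted, which is the source of
    the criterion's robustness to outliers."""
    r = y - X @ beta                      # residual vector
    n = r.size
    return (tau / (2.0 * np.sqrt(np.pi))
            - (tau / n) * np.sqrt(2.0 / np.pi)
            * np.sum(np.exp(-0.5 * (tau * r) ** 2)))
```

For a perfect fit (all residuals zero) the sum collapses to $n$, so the criterion reduces to $\tau\big(\tfrac{1}{2\sqrt{\pi}} - \sqrt{2/\pi}\big)$, its minimum over $\beta$ for fixed $\tau$; the MM and modified Newton updates described above alternately decrease this objective in $\beta$ and $\tau$.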