This paper is concerned with a partially linear semiparametric regression model containing an unknown regression coefficient vector, an unknown nonparametric function, and an unobserved Gaussian random error. We focus on simultaneous variable selection and estimation with a divergent number of covariates, under the assumption that the regression coefficient is sparse. We consider the application of least squares to semiparametric regression and, in particular, present an adaptive lasso penalized least squares (PLS) method to select the regression coefficients. Although many algorithms for PLS exist in various applications, they appear to be rarely used in semiparametric regression. This paper focuses on using a semismooth Newton augmented Lagrangian (SSNAL) algorithm to solve the dual of PLS, which is the sum of a smooth strongly convex function and an indicator function. At each iteration, a strongly semismooth nonlinear system must be solved, and this can be done efficiently by the semismooth Newton method by fully exploiting the structure of the penalty term. We show that the algorithm offers a significant computational advantage and that the semismooth Newton method admits a fast local convergence rate. Numerical experiments on simulated and real data demonstrate that the PLS method is effective and the SSNAL algorithm is efficient.
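To fix ideas, a minimal sketch of the setting follows; the notation below is ours, since the abstract does not fix any symbols. The partially linear model takes the form
\[
y_i = x_i^{\top}\beta + g(t_i) + \varepsilon_i, \qquad \varepsilon_i \sim N(0, \sigma^2), \quad i = 1, \dots, n,
\]
where $\beta \in \mathbb{R}^{p_n}$ is the sparse coefficient vector, with $p_n$ allowed to diverge with $n$, and $g$ is the unknown nonparametric function. After profiling out $g$ (for instance by spline or kernel smoothing, yielding partial residuals $\tilde y$ and $\tilde X$), an adaptive lasso PLS estimator solves
\[
\min_{\beta \in \mathbb{R}^{p_n}} \; \frac{1}{2}\,\|\tilde y - \tilde X \beta\|_2^2 + \lambda_n \sum_{j=1}^{p_n} w_j |\beta_j|,
\]
with data-driven weights $w_j$ (typically $w_j = |\hat\beta_j^{\mathrm{init}}|^{-\gamma}$ for an initial estimator $\hat\beta^{\mathrm{init}}$ and some $\gamma > 0$).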
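Likewise, a standard Fenchel-dual calculation (again in our notation, as the abstract states only the structure of the dual) shows why the dual objective is a smooth strongly convex function plus an indicator: the dual of the PLS problem above can be written as
\[
\min_{u \in \mathbb{R}^n} \; \frac{1}{2}\,\|u\|_2^2 - \langle \tilde y, u \rangle + \delta_{\mathcal{B}}(\tilde X^{\top} u), \qquad \mathcal{B} = \bigl\{ v \in \mathbb{R}^{p_n} : |v_j| \le \lambda_n w_j,\ j = 1, \dots, p_n \bigr\},
\]
where $\delta_{\mathcal{B}}$ is the indicator function of the box $\mathcal{B}$. An augmented Lagrangian method applied to this dual leads, at each outer iteration, to a strongly semismooth nonlinear system whose generalized Jacobian inherits the sparsity of the proximal mapping of the weighted $\ell_1$ penalty (soft-thresholding); this structure is what keeps the inner semismooth Newton steps inexpensive.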