This paper is concerned with a partially linear semiparametric regression model with an unknown regression coefficient vector, an unknown nonparametric function for the nonlinear component, and unobservable Gaussian random errors. We consider the application of least squares to semiparametric regression and, in particular, present an adaptive lasso penalized least squares (PLS) procedure to select the regression coefficients. In contrast to almost all numerical methods in the previous literature, this paper concentrates on the corresponding dual problem. We observe that the dual problem consists of a smooth strongly convex function and an indicator function, so it can be solved by the semismooth Newton augmented Lagrangian (SSNAL) algorithm. In addition, a strongly semismooth nonlinear system must be solved at each iteration, which can be handled by a semismooth Newton method that makes full use of the structure of the proximal mappings. We show that the implemented algorithm offers a notable computational advantage in statistical regression inference and that the sequence generated by the method admits a fast local convergence rate under some assumptions. Numerical experiments on simulated and real data are conducted, and performance comparisons with the ADMM are included to demonstrate the effectiveness of PLS and the superiority of SSNAL.
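As a hedged illustration of the adaptive lasso penalized least squares step for the parametric component, the following NumPy sketch solves min_beta (1/2)||y - X beta||^2 + lambda * sum_j w_j |beta_j| by coordinate descent, with weights w_j built from a pilot estimate. This is only a minimal primal sketch under standard assumptions, not the paper's dual SSNAL algorithm; the function name `adaptive_lasso` and all parameter choices are illustrative.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal mapping of the absolute value: sign(z) * max(|z| - t, 0)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def adaptive_lasso(X, y, lam, gamma=1.0, n_iter=200):
    """Coordinate-descent sketch of adaptive lasso penalized least squares.

    Minimizes (1/2)||y - X beta||^2 + lam * sum_j w_j |beta_j|,
    with adaptive weights w_j = 1 / |beta_pilot_j|^gamma.
    """
    n, p = X.shape
    # Pilot estimate (slightly ridge-regularized for numerical stability)
    beta_pilot = np.linalg.solve(X.T @ X + 1e-6 * np.eye(p), X.T @ y)
    w = 1.0 / (np.abs(beta_pilot) ** gamma + 1e-12)
    beta = np.zeros(p)
    col_norm2 = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual excluding coordinate j
            r_j = y - X @ beta + X[:, j] * beta[j]
            # Coordinate-wise soft-thresholding update
            beta[j] = soft_threshold(X[:, j] @ r_j, lam * w[j]) / col_norm2[j]
    return beta
```

Because the pilot estimates of truly zero coefficients are small, their weights become large and those coordinates are thresholded to exactly zero, which is the variable-selection effect the adaptive lasso is used for here.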