Generalized linear models are flexible tools for the analysis of diverse datasets, but the classical formulation requires that the parametric component be correctly specified and that the data contain no atypical observations. To address these shortcomings, we introduce and study a family of nonparametric full-rank and lower-rank spline estimators obtained by minimizing a penalized power divergence. The proposed class of estimators is easy to implement, offers strong protection against outlying observations, and can be tuned for arbitrarily high efficiency on clean data. We show that, under weak assumptions, these estimators converge at a fast rate, and we illustrate their highly competitive performance in a simulation study and two real-data examples.
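As a brief, hedged sketch rather than the paper's exact definition: one widely used power divergence with a tunable efficiency constant is the density power divergence, under which a penalized spline criterion could take the form
\[
L_{n,\alpha,\lambda}(f) \;=\; \frac{1}{n}\sum_{i=1}^{n}\left\{ \int m_{f}(y \mid x_i)^{1+\alpha}\,\mathrm{d}y \;-\; \frac{1+\alpha}{\alpha}\, m_{f}(y_i \mid x_i)^{\alpha} \right\} \;+\; \lambda \int \{ f''(t) \}^{2}\,\mathrm{d}t,
\]
where $m_{f}(y \mid x)$ denotes the conditional response density implied by the spline component $f$, $\alpha>0$ governs the trade-off between robustness and efficiency (as $\alpha \to 0$ the first term recovers the negative log-likelihood up to terms not depending on $f$), and $\lambda>0$ controls the roughness penalty. The symbols $m_f$, $\alpha$, and $\lambda$ are illustrative assumptions and are not taken from the abstract itself.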