We study the problem of selecting features associated with extreme values in high-dimensional linear regression. In linear modeling problems, the presence of abnormal extreme values, or outliers, is typically treated as an anomaly to be removed from the data or remedied with robust regression methods. In many settings, however, the extreme values in regression modeling are not outliers but rather the signals of interest; consider traces from spiking neurons, volatility in finance, or extreme events in climate science, for example. In this paper, we propose a new method for sparse high-dimensional linear regression on extreme values, motivated by the Subbotin, or generalized normal, distribution, which we call the extreme value linear regression model. Our method uses an $\ell_p$-norm loss where $p$ is an even integer greater than two; we show that this loss places greater weight on extreme values. We prove estimation consistency and variable selection consistency for the extreme value linear regression with a Lasso penalty, which we term the Extreme Lasso, and we analyze the theoretical impact of extreme value observations on the model parameter estimates using the concept of influence functions. Through simulation studies and a real-world data example, we show that the Extreme Lasso outperforms other methods currently used in the literature for selecting features associated with extreme values in high-dimensional regression.
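To illustrate the key intuition behind the $\ell_p$ loss, the following is a minimal sketch (not the paper's implementation; the residual values are hypothetical) showing that, for even $p > 2$, the gradient magnitude of $|r|^p$ grows like $p\,|r|^{p-1}$, so extreme residuals receive a far larger share of the total gradient weight than under the squared loss:

```python
import numpy as np

# Hypothetical residuals: two typical observations and one extreme one.
residuals = np.array([0.5, 1.0, 5.0])

def lp_grad_weight(r, p):
    # Gradient magnitude of the l_p loss term |r|^p with respect to r.
    return p * np.abs(r) ** (p - 1)

w2 = lp_grad_weight(residuals, 2)  # squared loss (p = 2)
w4 = lp_grad_weight(residuals, 4)  # l_p loss with even p > 2

# Share of total gradient weight carried by the extreme residual.
share_p2 = w2[-1] / w2.sum()
share_p4 = w4[-1] / w4.sum()
print(share_p2, share_p4)
```

Here the extreme residual accounts for about 77% of the gradient weight under the squared loss but over 99% under $p = 4$, consistent with the claim that the loss increasingly emphasizes extreme values as $p$ grows.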