Modern variable selection procedures make use of penalization methods to perform simultaneous model selection and estimation. A popular method is the LASSO (least absolute shrinkage and selection operator), which contains a tuning parameter. This parameter is typically chosen by minimizing the cross-validation error or the Bayesian information criterion (BIC), but this can be computationally intensive as it involves fitting an array of candidate models and selecting the best one. In contrast, we develop a procedure based on the so-called "smooth IC" (SIC), in which the tuning parameter is selected automatically in a single step. We also extend this model selection procedure to the so-called "multi-parameter regression" framework, which is more flexible than classical regression modelling. Multi-parameter regression introduces flexibility by accounting for the effect of covariates through multiple distributional parameters simultaneously, e.g., the mean and variance. These models are useful in the context of normal linear regression when the process under study exhibits heteroscedastic behaviour. Reformulating the multi-parameter regression estimation problem in terms of penalized likelihood enables us to take advantage of the close relationship between model selection criteria and penalization. Utilizing the SIC is computationally advantageous, as it obviates the need to choose multiple tuning parameters.
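To make concrete the conventional tuning workflow that the SIC is designed to avoid, the sketch below fits the LASSO by coordinate descent over a grid of tuning parameters and selects the value minimizing the BIC, i.e., it fits an array of candidate models and picks the best one. This is a minimal illustration on simulated data, not the paper's method; the grid, the simulated design, and all function names are assumptions made for the example.

```python
import numpy as np

def soft_threshold(z, g):
    # Soft-thresholding operator underlying the LASSO coordinate update
    return np.sign(z) * np.maximum(np.abs(z) - g, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    # Coordinate descent for (1/2n)||y - Xb||^2 + lam * ||b||_1
    n, p = X.shape
    b = np.zeros(p)
    col_scale = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual excluding coordinate j
            r = y - X @ b + X[:, j] * b[j]
            rho = X[:, j] @ r / n
            b[j] = soft_threshold(rho, lam) / col_scale[j]
    return b

def bic(X, y, b):
    # BIC for a Gaussian linear model; model size = number of nonzero coefficients
    n = len(y)
    rss = ((y - X @ b) ** 2).sum()
    k = (b != 0).sum()
    return n * np.log(rss / n) + k * np.log(n)

# Simulated heteroscedasticity-free example: only the first two covariates are active
rng = np.random.default_rng(0)
n, p = 100, 8
X = rng.standard_normal((n, p))
beta_true = np.array([2.0, -1.5, 0, 0, 0, 0, 0, 0])
y = X @ beta_true + 0.5 * rng.standard_normal(n)

# Fit one model per grid point, then select by BIC
lams = np.geomspace(0.001, 1.0, 30)
fits = [lasso_cd(X, y, lam) for lam in lams]
best = min(range(len(lams)), key=lambda i: bic(X, y, fits[i]))
print("chosen lambda:", lams[best])
print("selected covariates:", np.flatnonzero(fits[best]))
```

The grid search makes the computational burden visible: every candidate value of the tuning parameter requires a full model fit. A one-step criterion such as the SIC removes this outer loop, which matters even more in the multi-parameter regression setting, where each distributional parameter would otherwise need its own grid.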