We apply classical and Bayesian lasso regularization to a family of models involving both mixture and process variables. We analyse the performance of these estimators relative to ordinary least squares through a simulation study and a real-data application. Our results demonstrate the superior performance of the Bayesian lasso, particularly when fitted via coordinate ascent variational inference, in terms of variable selection accuracy and response optimization.
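A minimal sketch of the kind of comparison the abstract describes, using scikit-learn on synthetic sparse data (this is illustrative only, not the paper's code or data; the design matrix, true coefficients, and penalty level `alpha` are assumptions):

```python
# Illustrative comparison: classical lasso vs. OLS when the true
# coefficient vector is sparse (hypothetical data, not from the paper).
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(0)
n, p = 100, 10
X = rng.normal(size=(n, p))
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.5, 1.0]          # only 3 active predictors
y = X @ beta_true + rng.normal(scale=0.5, size=n)

ols = LinearRegression().fit(X, y)
lasso = Lasso(alpha=0.1).fit(X, y)        # alpha chosen for illustration

# The l1 penalty sets many irrelevant coefficients exactly to zero,
# performing variable selection; OLS estimates remain dense.
print("OLS nonzero coefficients:  ", int(np.sum(np.abs(ols.coef_) > 1e-8)))
print("Lasso nonzero coefficients:", int(np.sum(np.abs(lasso.coef_) > 1e-8)))
```

The Bayesian lasso studied in the paper replaces the fixed penalty with a Laplace prior on the coefficients; coordinate ascent variational inference then approximates the resulting posterior, which is not shown in this sketch.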