Sparse and blocky coefficients arise in many regression and classification problems. The fused Lasso was designed to recover such structured features, especially when the design matrix is ultrahigh dimensional. Quantile loss is well known as a robust loss function in regression and classification. In this paper, we combine the quantile loss and the fused Lasso penalty to produce the quantile fused Lasso, which achieves sparse and blocky feature selection in both regression and classification. Interestingly, the proposed model admits a unified optimization formulation for regression and classification. For ultrahigh dimensional data, we derive a multi-block linearized alternating direction method of multipliers (LADMM) algorithm to solve it. Moreover, we prove convergence and derive convergence rates of the proposed LADMM algorithm through an elegant argument. The algorithm can also be easily extended to solve many existing fused Lasso models. Finally, we present numerical results on several synthetic and real-world examples, illustrating the robustness, scalability, and accuracy of the proposed method.
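To make the model described above concrete, the following is a minimal sketch (not the paper's implementation) of the quantile fused Lasso objective: the quantile (pinball) loss plus an l1 sparsity term and a total-variation fusion term on adjacent coefficients. The function names, the choice of tuning parameters `lam1` and `lam2`, and the averaging over samples are illustrative assumptions.

```python
import numpy as np

def pinball_loss(r, tau):
    # Quantile (pinball) loss: tau * r for r >= 0, (tau - 1) * r otherwise.
    # tau = 0.5 recovers (half of) the absolute loss, i.e. median regression.
    return np.where(r >= 0, tau * r, (tau - 1) * r)

def quantile_fused_lasso_objective(beta, X, y, tau=0.5, lam1=1.0, lam2=1.0):
    # Illustrative objective (names and scaling are assumptions):
    #   (1/n) * sum_i rho_tau(y_i - x_i^T beta)          -- robust data fit
    #   + lam1 * ||beta||_1                              -- sparsity
    #   + lam2 * sum_j |beta_{j+1} - beta_j|             -- blockiness (fusion)
    n = X.shape[0]
    residuals = y - X @ beta
    loss = pinball_loss(residuals, tau).sum() / n
    sparsity = lam1 * np.abs(beta).sum()
    fusion = lam2 * np.abs(np.diff(beta)).sum()
    return loss + sparsity + fusion
```

A solver such as the paper's multi-block LADMM would minimize this objective over `beta`; the sketch only evaluates it, which is still useful for checking candidate solutions or tuning `lam1` and `lam2` on a grid.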