This paper proposes a method for solving multivariate regression and classification problems using piecewise linear predictors over a polyhedral partition of the feature space. The resulting algorithm, which we call PARC (Piecewise Affine Regression and Classification), alternates between (i) solving ridge regression problems for numeric targets, softmax regression problems for categorical targets, and either softmax regression or cluster centroid computation for piecewise linear separation, and (ii) assigning the training points to different clusters on the basis of a criterion that balances prediction accuracy and piecewise-linear separability. We prove that PARC is a block-coordinate descent algorithm that optimizes a suitably constructed objective function and converges in a finite number of steps to a local minimum of that function. The accuracy of the algorithm is extensively tested numerically on synthetic and real-world datasets, showing that the approach extends linear regression/classification in a way that is particularly useful when the resulting predictor is embedded in an optimization model. A Python implementation of the algorithm described in this paper is available at http://cse.lab.imtlucca.it/~bemporad/parc .
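The alternation between steps (i) and (ii) can be sketched for the numeric-target case as follows. This is a toy illustration under our own assumptions, not the authors' implementation: the function name `parc_sketch` and its parameters are ours, categorical targets and the piecewise-linear separability term of the assignment criterion are omitted, and points are reassigned purely by prediction error.

```python
import numpy as np

def parc_sketch(X, y, K=2, alpha=1e-3, iters=10, seed=0):
    """Toy PARC-style alternation for numeric targets only:
    (i) fit one ridge regressor per cluster, then (ii) reassign
    each training point to the cluster whose local affine model
    predicts it best; repeat until the assignment is fixed."""
    rng = np.random.default_rng(seed)
    n = len(X)
    z = rng.integers(K, size=n)                 # random initial assignment
    Xb = np.hstack([X, np.ones((n, 1))])        # append 1 for the affine term
    W = np.zeros((K, Xb.shape[1]))              # one weight row per cluster
    for _ in range(iters):
        for k in range(K):                      # (i) ridge fit per cluster
            idx = z == k
            if not idx.any():                   # skip emptied clusters
                continue
            A = Xb[idx].T @ Xb[idx] + alpha * np.eye(Xb.shape[1])
            W[k] = np.linalg.solve(A, Xb[idx].T @ y[idx])
        err = (Xb @ W.T - y[:, None]) ** 2      # (ii) squared error per model
        z_new = err.argmin(axis=1)              # reassign to best-fitting model
        if np.array_equal(z_new, z):            # fixed point reached
            break
        z = z_new
    return W, z
```

On data generated from two affine pieces, the alternation typically recovers a partition on which each local ridge model fits well; the full PARC algorithm additionally trades fit quality against how well the clusters can be separated by a piecewise-linear function.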