Since the publication of Breiman (2001), Random Forests (RF) have been widely used in both regression and classification. Other forest variants have since been proposed and studied in the literature; Mondrian Forests, built on the Mondrian process, are a notable example (Lakshminarayanan et al., 2014). In this paper, we propose an ensemble estimator for general statistical learning based on Mondrian Forests, which can be regarded as an extension of RF. This general framework covers many common learning problems, such as least squares regression, least $\ell_1$ regression, quantile regression, and classification. Under mild conditions on the loss functions, we derive an upper bound on the regret/risk of this forest estimator and show that the estimator is statistically consistent.
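For concreteness, a hedged sketch of the quantities involved (the symbols $\ell$, $\hat{f}_n$, and $f^*$ are generic placeholders, not necessarily the paper's own notation): given a loss $\ell$ and a sample of size $n$, the risk of the forest estimator $\hat{f}_n$ and its regret relative to a Bayes-optimal predictor $f^*$ are
\[
R(\hat{f}_n) = \mathbb{E}\big[\ell\big(\hat{f}_n(X), Y\big)\big],
\qquad
\mathrm{Regret}(\hat{f}_n) = R(\hat{f}_n) - R(f^*),
\quad f^* \in \arg\min_{f} R(f),
\]
and statistical consistency amounts to this regret vanishing as $n \to \infty$.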