In this paper, we propose a gradient boosting algorithm for regression called \textit{adaptive boosting histogram transform} (\textit{ABHT}) to illustrate the local adaptivity of gradient boosting algorithms in histogram transform ensemble learning. From the theoretical perspective, when the target function lies in a locally H\"older continuous space, we show that ABHT can filter out regions with different orders of smoothness. Consequently, we prove that the upper bound of the convergence rate of ABHT is strictly smaller than the lower bound of the \textit{parallel ensemble histogram transform} (\textit{PEHT}). Experiments on both synthetic and real-world datasets empirically validate these theoretical results and demonstrate the advantageous performance and local adaptivity of ABHT.