Gradient boosting decision tree (GBDT) is a widely used ensemble algorithm in industry. Its vertical federated learning version, SecureBoost, is one of the most popular algorithms for cross-silo privacy-preserving modeling. As the field of privacy-preserving computation has thrived in recent years, the demand for large-scale, high-performance federated learning has grown dramatically in real-world applications. In this paper, to meet these requirements, we propose SecureBoost+, a novel and improved version of the prior work SecureBoost. SecureBoost+ integrates several ciphertext computation optimizations and engineering optimizations. The experimental results demonstrate that SecureBoost+ achieves significant performance improvements over SecureBoost on large and high-dimensional data sets, making effective and efficient large-scale vertical federated learning possible.