Recommenders are central to many applications today. The most effective recommendation schemes, such as those based on collaborative filtering (CF), exploit similarities between user profiles to make recommendations, but potentially expose private data. Federated learning and decentralized learning systems address this by letting the data stay on users' machines to preserve privacy: each user trains on local data and shares only the model parameters. However, sharing the model parameters across the network may still lead to privacy breaches. In this paper, we present REX, the first enclave-based decentralized CF recommender. REX exploits trusted execution environments (TEEs), such as Intel Software Guard Extensions (SGX), which provide shielded execution environments within the processor, to improve convergence while preserving privacy. First, REX enables raw data sharing, which ultimately speeds up convergence and reduces the network load. Second, REX fully preserves privacy. We analyze the impact of raw data sharing in both deep neural network (DNN) and matrix factorization (MF) recommenders and showcase the benefits of trusted environments in a full-fledged implementation of REX. Our experimental results demonstrate that, through raw data sharing, REX decreases the training time by 18.3x and the network load by two orders of magnitude over standard decentralized approaches that share only parameters, while fully protecting privacy by leveraging trusted hardware enclaves with very little overhead.
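To make the contrast between parameter sharing and raw data sharing concrete, the following is a minimal, self-contained sketch of decentralized matrix factorization; it is not the REX implementation. The class name, hyper-parameters, and two-user topology are hypothetical, and the SGX enclave machinery (attestation, encrypted exchange, in-enclave decryption) is elided entirely: the sketch only illustrates that a node can either gossip-average its item factors with neighbors (standard decentralized learning) or fold neighbors' raw ratings directly into its local SGD steps (REX-style sharing, which the enclaves make safe).

```python
# Minimal sketch, NOT the authors' implementation: decentralized MF where each
# node trains locally and either (a) exchanges model parameters with neighbors
# or (b) exchanges raw ratings, as REX's enclaves allow.
import numpy as np

RANK, LR, REG = 8, 0.01, 0.05  # hypothetical hyper-parameters


class UserNode:
    def __init__(self, user_id, ratings, n_items, rng):
        self.user_id = user_id
        self.ratings = dict(ratings)                   # item_id -> rating (local raw data)
        self.p = rng.normal(0, 0.1, RANK)              # this user's latent profile
        self.Q = rng.normal(0, 0.1, (n_items, RANK))   # local copy of item factors

    def local_sgd_step(self, extra_ratings=()):
        """One SGD pass over local ratings plus any raw ratings received from
        neighbors (in the real system these would be decrypted only inside the enclave)."""
        for item, r in list(self.ratings.items()) + list(extra_ratings):
            err = r - self.p @ self.Q[item]
            self.p += LR * (err * self.Q[item] - REG * self.p)
            self.Q[item] += LR * (err * self.p - REG * self.Q[item])

    def average_item_factors(self, neighbors):
        """Parameter-sharing baseline: gossip-average the item factor matrix."""
        self.Q = np.mean([self.Q] + [n.Q for n in neighbors], axis=0)


# Toy usage: two users, three items.
rng = np.random.default_rng(0)
alice = UserNode(0, {0: 5.0, 1: 1.0}, n_items=3, rng=rng)
bob = UserNode(1, {1: 2.0, 2: 4.0}, n_items=3, rng=rng)

# (a) Baseline: train locally, then share only parameters.
for _ in range(100):
    alice.local_sgd_step(); bob.local_sgd_step()
    alice.average_item_factors([bob]); bob.average_item_factors([alice])

# (b) REX-style: neighbors' raw ratings flow into local SGD directly.
for _ in range(100):
    alice.local_sgd_step(extra_ratings=bob.ratings.items())
    bob.local_sgd_step(extra_ratings=alice.ratings.items())

print("Alice's prediction for item 2:", alice.p @ alice.Q[2])
```

In variant (b) each node effectively trains on a larger local dataset per round, which is the intuition behind the faster convergence and lower network load the abstract reports; in variant (a) the same information must propagate indirectly through repeated rounds of parameter averaging.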