Combining federated learning with differential privacy makes it possible to train deep models with privacy guarantees. Local differential privacy (LDP) requires no trust in the server, but its utility is limited by the strong perturbation applied to gradients. Client-level central differential privacy (CDP), on the other hand, offers a good balance between privacy and model utility, but requires trust in the central server because clients must share raw gradients. We propose OLIVE, a system that enjoys the benefits of CDP while, like LDP, eliminating the need to trust the server, by using a Trusted Execution Environment (TEE), which has attracted much attention in recent years. In particular, OLIVE provides an efficient data-oblivious algorithm that minimizes the privacy risk arising during aggregation inside a TEE, even on a privileged untrusted server. In this work, we first design an inference attack that leaks training-data privacy from the index information of sparsified gradients, which an attacker can obtain through side channels, and demonstrate the attack's effectiveness on real-world datasets. Second, we propose a fully oblivious yet efficient algorithm that keeps memory access patterns completely uniform, protecting privacy against the designed attack. Our experiments show that the method is practical: it is more efficient than state-of-the-art general-purpose Oblivious RAM and viable at real-world scales.
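To make the core idea concrete, the following is a minimal sketch (not the paper's actual algorithm) of why aggregating sparsified gradients can leak index information, and how a data-oblivious linear scan avoids it. All names here are illustrative assumptions: each client is assumed to submit its update as (index, value) pairs, and the server accumulates them into a dense buffer. A naive scatter (`acc[idx] += val`) accesses memory at positions determined by the secret indices, which a privileged attacker can observe through side channels; the scan below touches every slot for every pair, so its access pattern is independent of the data. (Python is only illustrative here: real enclave code would need genuinely constant-time, branchless selects, e.g. in C.)

```python
def oblivious_aggregate(model_size, client_updates):
    """Accumulate sparsified client updates with a uniform access pattern.

    client_updates: list of updates, one per client, each a list of
    (index, value) pairs (e.g. a top-k sparsified gradient).
    """
    acc = [0.0] * model_size
    for update in client_updates:
        for idx, val in update:
            # Touch every accumulator slot regardless of idx, so the
            # sequence of memory accesses reveals nothing about which
            # coordinates this client actually sent.
            for j in range(model_size):
                match = float(j == idx)  # arithmetic select instead of a data-dependent write
                acc[j] += match * val
    return acc
```

The cost of this sketch is O(model_size) per pair instead of O(1), which is exactly the efficiency gap that a carefully designed oblivious algorithm (or general-purpose ORAM) aims to narrow.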