Outsourced computation for neural networks gives users access to state-of-the-art models without having to invest in specialized hardware and know-how. The drawback is that users lose control over potentially privacy-sensitive data. With homomorphic encryption (HE), computation can be performed on encrypted data without revealing its content. In this systematization of knowledge, we take an in-depth look at approaches that combine neural networks with HE for privacy preservation. We categorize the changes made to neural network models and architectures to make them computable over HE, and examine how these changes affect performance. We identify numerous challenges to HE-based privacy-preserving deep learning, such as computational overhead, usability, and limitations imposed by the encryption schemes.
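As a minimal sketch of the property the abstract refers to (computing on data without seeing it), the following toy example uses the Paillier cryptosystem, which is additively homomorphic, to evaluate one neuron's linear pre-activation directly on ciphertexts. This is an illustration only, not the leveled schemes (e.g., CKKS/BFV) used by most of the surveyed works; the small primes are hypothetical and the parameters are far too weak for real use.

```python
import math
import secrets

# Toy Paillier keypair (tiny primes chosen only for illustration; insecure).
p, q = 1000003, 1000033
n = p * q
n_sq = n * n
g = n + 1                        # standard simplification g = n + 1
lam = math.lcm(p - 1, q - 1)
mu = pow(lam, -1, n)             # valid since L(g^lam mod n^2) = lam mod n

def encrypt(m: int) -> int:
    """Encrypt an integer m < n under the public key (n, g)."""
    r = secrets.randbelow(n - 1) + 1
    return (pow(g, m, n_sq) * pow(r, n, n_sq)) % n_sq

def decrypt(c: int) -> int:
    """Decrypt with the private key (lam, mu)."""
    L = (pow(c, lam, n_sq) - 1) // n
    return (L * mu) % n

# Additive homomorphism: a dot product with plaintext weights, evaluated
# entirely on encrypted inputs (one neuron's pre-activation).
weights = [3, 1, 4]                           # plaintext model weights
enc_inputs = [encrypt(x) for x in [2, 7, 1]]  # encrypted user inputs

enc_dot = 1
for w, c in zip(weights, enc_inputs):
    enc_dot = (enc_dot * pow(c, w, n_sq)) % n_sq  # adds w * x homomorphically

assert decrypt(enc_dot) == 3 * 2 + 1 * 7 + 4 * 1  # == 17, never decrypted inputs
```

Multiplying ciphertexts (and raising them to plaintext exponents) corresponds to adding the underlying plaintexts, which is exactly why linear layers map naturally onto HE while non-linear activations require the model changes surveyed in this work.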