Although homomorphic encryption can be incorporated into neural network layers to secure machine learning tasks, such as confidential inference over encrypted data samples and encrypted local models in federated learning, its computational overhead has been an Achilles' heel. This paper proposes a caching protocol, namely CHEM, so that tensor ciphertexts can be constructed from a pool of cached radixes rather than through expensive encryption operations. From a theoretical perspective, we demonstrate that CHEM is semantically secure and can be parameterized with a straightforward analysis under practical assumptions. Experimental results on three popular public datasets show that adopting CHEM incurs only sub-second overhead, yet reduces the encryption cost by 48%--89% for encoding input data samples in confidential inference and by 67%--87% for encoding local models in federated learning.
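To make the caching idea concrete, below is a minimal sketch, not the paper's implementation: it transplants radix caching onto textbook Paillier (an additively homomorphic scheme) purely because the radix arithmetic is easy to see there, whereas CHEM targets tensor ciphertexts. The base `B`, digit count `L`, the toy primes, and the helper `encrypt_from_cache` are all assumptions of this sketch. Encryptions of the radix powers `B^i` are cached offline; a ciphertext of `m = sum_i d_i * B^i` is then assembled with cheap small-exponent operations instead of a fresh full-cost encryption.

```python
"""Sketch of radix caching on toy Paillier (assumed scheme, insecure sizes)."""
import math
import secrets

p, q = 999_983, 1_000_003          # toy primes, illustration only
n, n2 = p * q, (p * q) ** 2
g, lam = n + 1, math.lcm(p - 1, q - 1)

def encrypt(m: int) -> int:
    """Standard Paillier encryption: dominated by a full-size modular
    exponentiation with exponent n -- the expensive step CHEM avoids."""
    r = secrets.randbelow(n - 2) + 2
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    """Paillier decryption, used here only to check that ciphertexts
    assembled from the cache are valid encryptions."""
    return ((pow(c, lam, n2) - 1) // n) * pow(lam, -1, n) % n

# Offline phase: cache encryptions of the radix powers B^0 .. B^(L-1).
B, L = 256, 4                      # assumed base and digit count
cache = [encrypt(B ** i) for i in range(L)]

def encrypt_from_cache(m: int) -> int:
    """Assemble Enc(m) from the cached Enc(B^i) using only small-exponent
    exponentiations and multiplications mod n^2.  When and how cached
    randomness may be reused safely is exactly what the paper's security
    analysis governs; this sketch glosses over re-randomization."""
    c = 1                          # multiplicative identity, i.e., Enc(0)
    for i, ci in enumerate(cache):
        d = (m // B ** i) % B      # i-th base-B digit of m
        if d:
            c = (c * pow(ci, d, n2)) % n2
    return c

# Round trip: a cache-built ciphertext decrypts to the original plaintext.
assert decrypt(encrypt_from_cache(123_456_789)) == 123_456_789
```

The cost asymmetry is the point of the protocol: `encrypt` pays one exponentiation with an exponent of roughly the size of `n`, while `encrypt_from_cache` pays at most `L` exponentiations with exponents below `B`, which is where the reported 48%--89% savings would come from under this sketch's assumptions.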