Training large-scale CNNs that can run under Homomorphic Encryption (HE) at inference time is challenging because only polynomial operations are permitted, which limits the adoption of HE-based solutions. We address this challenge and pioneer a training method for large polynomial CNNs such as ResNet-152 and ConvNeXt, achieving promising accuracy on encrypted samples from large-scale datasets such as ImageNet. In addition, we provide optimization insights on the latency impact of activation functions and skip connections, improving the efficiency of HE-based evaluation. Finally, to demonstrate the robustness of our method, we present a polynomial adaptation of the CLIP model for secure zero-shot prediction, unlocking unprecedented capabilities at the intersection of HE and transfer learning.
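To make the polynomial constraint concrete, here is a minimal sketch (not the paper's training method): HE schemes that support arithmetic circuits evaluate only additions and multiplications, so a non-polynomial layer such as ReLU must be replaced by a polynomial surrogate. The degree, fitting interval, and least-squares fit below are illustrative assumptions.

```python
import numpy as np

# Fit a degree-2 least-squares approximation of ReLU on [-4, 4].
# The interval is an assumed bound on activation magnitudes; keeping
# activations inside it is what makes the surrogate accurate.
xs = np.linspace(-4.0, 4.0, 2001)
relu = np.maximum(xs, 0.0)
coeffs = np.polyfit(xs, relu, deg=2)      # highest-degree coefficient first
poly_relu = np.polyval(coeffs, xs)

# The surrogate uses only additions, multiplications, and constants,
# so it can be evaluated on encrypted data under an arithmetic HE scheme.
max_err = np.max(np.abs(poly_relu - relu))
print(f"degree-2 fit, max abs error on [-4, 4]: {max_err:.3f}")
```

Higher degrees shrink the approximation error but deepen the multiplicative circuit, which is exactly the accuracy/latency trade-off the abstract's optimization insights address.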