This paper proposes a non-interactive end-to-end solution for secure fusion and matching of biometric templates using fully homomorphic encryption (FHE). Given a pair of encrypted feature vectors, we perform the following ciphertext operations: (i) feature concatenation, (ii) fusion and dimensionality reduction through a learned linear projection, (iii) scale normalization to unit $\ell_2$-norm, and (iv) match score computation. Our method, dubbed HEFT (Homomorphically Encrypted Fusion of biometric Templates), is custom-designed to overcome the unique constraint imposed by FHE, namely the lack of support for non-arithmetic operations. From an inference perspective, we systematically explore different data packing schemes for computationally efficient linear projection and introduce a polynomial approximation for scale normalization. From a training perspective, we introduce an FHE-aware algorithm for learning the linear projection matrix that mitigates errors induced by the approximate normalization. Experimental evaluation of template fusion and matching for face and voice biometrics shows that HEFT (i) improves biometric verification performance by 11.07% and 9.58% AUROC over the respective unibiometric representations while compressing the feature vectors by a factor of 16 (512D to 32D), and (ii) fuses a pair of encrypted feature vectors and computes their match score against a gallery of size 1024 in 884 ms. Code and data are available at https://github.com/human-analysis/encrypted-biometric-fusion
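To make the FHE constraint concrete: ciphertext evaluation supports only additions and multiplications, so the division and square root needed for $\ell_2$-normalization must be replaced by a polynomial approximation. The plaintext sketch below illustrates one standard such approximation, Newton's iteration for $1/\sqrt{x}$, which uses only additions and multiplications; the initial guess, iteration count, and input range are illustrative assumptions, not the actual parameters or polynomial used in HEFT.

```python
# Plaintext sketch of FHE-friendly approximate l2-normalization and
# match scoring. Under FHE only additions and multiplications are
# available, so 1/sqrt(x) is approximated by the polynomial update
#   y_{k+1} = y_k * (1.5 - 0.5 * x * y_k^2)   (Newton's iteration).
# y0 and iters are illustrative choices, not HEFT's actual parameters.

def approx_inv_sqrt(x, y0=1.0, iters=8):
    """Approximate 1/sqrt(x) using additions and multiplications only.

    Converges for x in a bounded range around the initial guess
    (here roughly x in (0, 3) with y0 = 1.0)."""
    y = y0
    for _ in range(iters):
        y = y * (1.5 - 0.5 * x * y * y)
    return y

def approx_normalize(v, iters=8):
    """Scale v to (approximately) unit l2-norm without sqrt or division.

    Assumes the squared norm lies in the convergence range of the
    fixed initial guess, i.e. features are pre-scaled accordingly."""
    sq_norm = sum(c * c for c in v)
    s = approx_inv_sqrt(sq_norm, iters=iters)
    return [c * s for c in v]

def match_score(a, b):
    """Inner product of two (approximately) unit-norm vectors, i.e. an
    approximate cosine similarity -- again additions and multiplications
    only, hence directly evaluable on ciphertexts."""
    return sum(x * y for x, y in zip(a, b))
```

Because the approximation is only accurate on a bounded input range, training the projection matrix with this error in mind (the paper's FHE-aware training) matters: vectors whose squared norms drift outside that range are normalized poorly, degrading match scores.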