In this paper, a low-parameter deep learning framework based on the Non-metric Multi-Dimensional Scaling (NMDS) method is proposed to recover the 3D shape of the 2D landmarks of a human face from a single input image. To our knowledge, this is the first time the NMDS approach has been used to establish a mapping from a 2D landmark space to the corresponding 3D shape space. A deep neural network learns the pairwise dissimilarities among the 2D landmarks used by the NMDS approach, with the objective of learning the pairwise 3D Euclidean distances of the corresponding landmarks in the input image. This scheme yields a symmetric dissimilarity matrix of rank greater than 2, guiding the NMDS approach toward an appropriate recovery of the 3D shape of the 2D landmarks. To handle posed images and complex image formation processes, such as perspective projection, that cause occlusion in the input image, the proposed framework includes an autoencoder component acting as an occlusion-removal stage, which turns different input views of the human face into a profile view. A performance evaluation on several synthetic and real-world human face datasets, including the Basel Face Model (BFM), CelebA, CoMA - FLAME, and CASIA-3D, indicates that, despite its small number of training parameters, the proposed framework is comparable to powerful state-of-the-art 3D reconstruction methods from the literature in terms of efficiency and accuracy.
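The core idea above, recovering 3D coordinates from a learned pairwise distance matrix, can be illustrated with a minimal sketch. The snippet below uses classical (metric) MDS rather than the non-metric variant employed in the paper, and the landmark coordinates are random stand-ins for predicted face landmarks; it only shows that a full-rank symmetric Euclidean distance matrix determines the 3D shape up to a rigid transform.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical 3D landmark coordinates (stand-ins for the 68 face landmarks).
X = rng.standard_normal((68, 3))

# Pairwise 3D Euclidean distance matrix D: in the paper, a deep network
# predicts this dissimilarity matrix from the 2D landmarks of the image.
D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)

# Classical MDS: double-center the squared distances, then eigendecompose.
n = D.shape[0]
J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
B = -0.5 * J @ (D ** 2) @ J              # Gram matrix of the centered points
eigvals, eigvecs = np.linalg.eigh(B)
top = np.argsort(eigvals)[::-1][:3]      # top-3 components -> 3D embedding
X_rec = eigvecs[:, top] * np.sqrt(np.maximum(eigvals[top], 0.0))

# The recovered shape reproduces all pairwise distances, i.e. it matches
# the original landmarks up to rotation, reflection, and translation.
D_rec = np.linalg.norm(X_rec[:, None, :] - X_rec[None, :, :], axis=-1)
print(np.allclose(D, D_rec, atol=1e-8))
```

Non-metric MDS relaxes this setup by requiring only that the embedding preserve the rank order of the dissimilarities, which is why a rank greater than 2 in the learned matrix matters for recovering a genuinely 3D configuration.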