3D modeling of non-linear objects from stylized sketches is a challenge even for experts in Computer Graphics (CG). Extrapolating object parameters from a stylized sketch is a complex and cumbersome task. In this study, we propose a broker system that mediates between the modeler and the 3D modeling software, transforming a stylized sketch of a tree into a complete 3D model. The input sketches do not need to be accurate or detailed; they only need to represent a rudimentary outline of the tree that the modeler wishes to 3D-model. Our approach is based on a well-defined Deep Neural Network (DNN) architecture, which we call TreeSketchNet (TSN). It is a convolutional network that generates Weber and Penn parameters, which the modeling software interprets to produce a 3D model of a tree from a simple sketch. The training dataset consists of Synthetically-Generated \revision{(SG)} sketches paired with Weber-Penn parameters produced by a dedicated Blender modeling add-on. We demonstrate the accuracy of the proposed method by testing the TSN with both synthetic and hand-made sketches. Finally, we provide a qualitative analysis of our results by evaluating the coherence of the predicted parameters with several distinguishing features.