Recent advances in generative modeling have driven significant progress in text-guided texture synthesis. However, current methods focus on synthesizing textures for individual static 3D objects and struggle to handle entire families of shapes, such as those produced by procedural programs. Naively applying existing methods to each procedural shape is too slow to support exploring different parameter configurations at interactive rates, and it also produces inconsistent textures across the procedural shapes. To address these challenges, we introduce ProcTex, the first text-to-texture system designed for part-based procedural models. ProcTex enables consistent, real-time text-guided texture synthesis for families of shapes and integrates seamlessly with the interactive design flow of procedural modeling. To ensure consistency, our core approach is to synthesize a texture for a template shape from the procedural model and then transfer it to other procedural shapes by solving for dense correspondences. To ensure interactivity, we propose a novel correspondence network and show that dense correspondences for procedural models can be learned effectively by a neural network. We also develop several supporting techniques, including a retexturing pipeline that handles structural variation induced by procedural parameters and part-level UV texture map generation for local appearance editing. Extensive experiments on a diverse set of procedural models validate ProcTex's ability to produce high-quality, visually consistent textures while supporting interactive applications.
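To make the correspondence-based transfer idea concrete, the sketch below shows one minimal way such a stage could look: given a texture synthesized on the template shape and predicted template UV coordinates for each vertex of a target procedural shape (the dense correspondence), per-vertex colors are obtained by sampling the template texture. The function name, array shapes, and nearest-neighbor sampling are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def transfer_texture_via_correspondence(template_texture, target_uv):
    """
    Minimal sketch of correspondence-based texture transfer (assumed interface).

    template_texture : (H, W, 3) float array, texture synthesized on the template shape.
    target_uv        : (N, 2) floats in [0, 1], template UV coordinates predicted for
                       each of the N target-shape vertices (the dense correspondence,
                       here assumed to come from a correspondence network).

    Returns (N, 3) per-vertex colors for the target shape.
    """
    H, W, _ = template_texture.shape
    # Map normalized UVs to pixel indices (nearest-neighbor sampling for brevity;
    # bilinear interpolation would be the natural refinement).
    px = np.clip((target_uv[:, 0] * (W - 1)).round().astype(int), 0, W - 1)
    py = np.clip(((1.0 - target_uv[:, 1]) * (H - 1)).round().astype(int), 0, H - 1)
    return template_texture[py, px]

if __name__ == "__main__":
    # Toy usage with random data.
    tex = np.random.rand(256, 256, 3)
    uv = np.random.rand(1000, 2)
    colors = transfer_texture_via_correspondence(tex, uv)
    print(colors.shape)  # (1000, 3)
```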