We study the problem of example-based procedural texture synthesis using highly compact models. Given a sample image, we use differentiable programming to train a generative process, parameterised by a recurrent Neural Cellular Automata (NCA) rule. Contrary to the common belief that neural networks should be significantly over-parameterised, we demonstrate that our model architecture and training procedure allow complex texture patterns to be represented with just a few hundred learned parameters, making their expressivity comparable to hand-engineered procedural texture generating programs. The smallest models from the proposed $\mu$NCA family scale down to 68 parameters. When quantised to one byte per parameter, the proposed models can be shrunk to between 68 and 588 bytes. A texture generator that uses these parameters to produce images can be implemented in just a few lines of GLSL or C code.
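To make the claim of a few-lines-of-C generator concrete, below is a minimal sketch of a single recurrent NCA update step. The specific choices here are assumptions, not the paper's architecture: 4 cell channels, a perception stage built from identity, Sobel-x/y and Laplacian $3\times3$ filters, a single dense layer with a ReLU placeholder nonlinearity, and toroidal wrap-around. With these choices the dense layer happens to hold $16\cdot4+4=68$ values, the same order of magnitude as the smallest models quoted above, but the actual $\mu$NCA parameterisation may differ.

```c
/* Hypothetical sketch of one synchronous NCA texture-update step in plain C.
 * Assumed (not from the paper): 4 channels, Sobel/Laplacian perception,
 * one dense layer. Wt/Bt would hold the learned (possibly byte-quantised)
 * parameters of a trained model. */
#define W  128         /* grid width                          */
#define H  128         /* grid height                         */
#define CH 4           /* channels per cell                   */
#define PERC (CH * 4)  /* identity + sobel-x + sobel-y + lap  */

static const float KX[9] = {-1, 0, 1, -2,  0, 2, -1, 0, 1};  /* Sobel x   */
static const float KY[9] = {-1,-2,-1,  0,  0, 0,  1, 2, 1};  /* Sobel y   */
static const float KL[9] = { 1, 2, 1,  2,-12, 2,  1, 2, 1};  /* Laplacian */

static float Wt[CH][PERC];  /* learned dense-layer weights (placeholder) */
static float Bt[CH];        /* learned biases (placeholder)              */

/* One full-grid update: state and next are H*W*CH arrays, grid wraps. */
void nca_step(const float *state, float *next) {
    for (int y = 0; y < H; ++y)
    for (int x = 0; x < W; ++x) {
        float p[PERC];
        /* Perception: per-channel 3x3 convolutions around (x, y). */
        for (int c = 0; c < CH; ++c) {
            float gx = 0, gy = 0, lap = 0;
            for (int dy = -1; dy <= 1; ++dy)
            for (int dx = -1; dx <= 1; ++dx) {
                int yy = (y + dy + H) % H, xx = (x + dx + W) % W;
                float v = state[(yy * W + xx) * CH + c];
                int k = (dy + 1) * 3 + (dx + 1);
                gx += KX[k] * v; gy += KY[k] * v; lap += KL[k] * v;
            }
            p[c]        = state[(y * W + x) * CH + c];
            p[CH + c]   = gx;
            p[2*CH + c] = gy;
            p[3*CH + c] = lap;
        }
        /* Update rule: residual update from a single dense layer. */
        for (int c = 0; c < CH; ++c) {
            float dv = Bt[c];
            for (int i = 0; i < PERC; ++i) dv += Wt[c][i] * p[i];
            if (dv < 0) dv = 0;  /* ReLU used here as a placeholder */
            next[(y * W + x) * CH + c] = state[(y * W + x) * CH + c] + dv;
        }
    }
}
```

Iterating `nca_step` from a noise-initialised state and mapping the first three channels to RGB would yield the generated texture; the loop structure translates almost line-for-line into a GLSL fragment shader operating on a state texture.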