Differentiable rasterization changes the common formulation of primitive rasterization -- which has zero gradients almost everywhere due to discontinuous edges and occlusion -- into an alternative one that is not subject to this limitation and has similar optima. These alternative formulations are, in general, ``soft'' versions of the original one. Unfortunately, it is not clear which exact way of softening provides the best performance in terms of converging most reliably to a desired goal. Previous work has analyzed and compared several combinations of softening operations. In this work, we take it a step further and, instead of making a combinatorial choice of softening operations, parametrize the continuous space of all softening operations. We study meta-learning a parametric S-shaped curve as well as an MLP over a set of inverse rendering tasks, so that the learned softening generalizes to new and unseen differentiable rendering tasks with optimal softness.
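For illustration, the sketch below shows the kind of softening the abstract refers to, in JAX: the hard inside/outside test of a triangle is replaced by a parametric S-shaped function of the signed edge distance, so that pixel coverage (and hence any rendering loss built on it) has non-zero gradients with respect to vertex positions. The function names and the parameters `steepness` and `exponent` are hypothetical stand-ins, not the parametrization studied in the paper.

```python
# Minimal sketch (not the authors' implementation) of a parametric,
# S-shaped softening of triangle rasterization.
import jax
import jax.numpy as jnp

def signed_distance_to_edge(p, a, b):
    """Signed distance of pixel centre p to the oriented 2D edge a -> b."""
    edge = b - a
    normal = jnp.array([-edge[1], edge[0]]) / jnp.linalg.norm(edge)
    return jnp.dot(p - a, normal)

def soft_coverage(d, steepness, exponent):
    """Parametric S-curve mapping a signed distance to soft coverage in (0, 1).

    steepness > 0 controls how quickly the curve saturates;
    exponent  > 0 warps its symmetry. In the limit (steepness -> inf,
    exponent = 1) the hard step function of classic rasterization is recovered.
    """
    s = jax.nn.sigmoid(steepness * d)
    return s ** exponent

def soft_triangle(p, v0, v1, v2, steepness, exponent):
    """Soft occupancy of pixel p inside the triangle (v0, v1, v2)."""
    occupancy = 1.0
    for a, b in ((v0, v1), (v1, v2), (v2, v0)):
        occupancy = occupancy * soft_coverage(
            signed_distance_to_edge(p, a, b), steepness, exponent)
    return occupancy

# Unlike the hard formulation, gradients with respect to vertex positions
# are now non-zero almost everywhere:
grad_wrt_v0 = jax.grad(soft_triangle, argnums=1)
```

In such a sketch, `steepness` and `exponent` would be the quantities exposed to meta-learning (or replaced by an MLP) rather than hand-picked, which is the continuous parametrization of softening operations the abstract describes.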