Learning neural fields has been an active topic in deep learning research, focusing, among other issues, on finding more compact and easy-to-fit representations. In this paper, we introduce a novel low-rank representation termed Tensor Train Neural Fields (TT-NF) for learning neural fields on dense regular grids and efficient methods for sampling from them. Our representation is a TT parameterization of the neural field, trained with backpropagation to minimize a non-convex objective. We analyze the effect of low-rank compression on the downstream task quality metrics in two settings. First, we demonstrate the efficiency of our method in a sandbox task of tensor denoising, which admits comparison with SVD-based schemes designed to minimize reconstruction error. Furthermore, we apply the proposed approach to Neural Radiance Fields, where the low-rank structure of the field corresponding to the best quality can be discovered only through learning.
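To make the core idea concrete, here is a minimal sketch of a Tensor Train (TT) parameterization of a dense regular grid. This is our own illustrative assumption, not the paper's implementation: we use NumPy, a toy 4×4×4 grid, and TT-rank 2; the function names `tt_full` and `tt_sample` are hypothetical.

```python
import numpy as np

# Toy setup: a scalar field on a dense n x n x n grid, stored as three TT cores
# instead of the full tensor. The cores would be the learnable parameters.
rng = np.random.default_rng(0)
n, r = 4, 2  # grid side and TT-rank (illustrative values)

# TT cores with boundary ranks of 1: shapes (1, n, r), (r, n, r), (r, n, 1).
G1 = rng.standard_normal((1, n, r))
G2 = rng.standard_normal((r, n, r))
G3 = rng.standard_normal((r, n, 1))

def tt_full(G1, G2, G3):
    """Contract the TT cores into the full n x n x n tensor."""
    return np.einsum('aib,bjc,ckd->ijk', G1, G2, G3)

def tt_sample(G1, G2, G3, i, j, k):
    """Evaluate a single voxel (i, j, k) without materializing the full tensor,
    by multiplying one slice per core: a chain of small matrix products."""
    return (G1[:, i, :] @ G2[:, j, :] @ G3[:, k, :])[0, 0]

T = tt_full(G1, G2, G3)
# Storage: n*r + r*n*r + r*n = 8 + 16 + 8 = 32 parameters vs n**3 = 64 entries;
# the gap grows rapidly with n, which is the compression the abstract refers to.
```

The per-voxel evaluation in `tt_sample` hints at why sampling from a TT-parameterized field can be efficient: each query touches only one slice of each core rather than the full grid.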