Neural Radiance Fields (NeRFs) are coordinate-based implicit representations of 3D scenes that use a differentiable rendering procedure to learn a representation of an environment from images. This paper extends NeRFs to handle dynamic scenes in an online fashion. We do so by introducing a particle-based parametric encoding, which allows the intermediate NeRF features -- now coupled to particles in space -- to be moved with the dynamic geometry. We backpropagate the NeRF's photometric reconstruction loss into the positions of the particles in addition to the features associated with them. The position gradients are interpreted as particle velocities and integrated into positions using a position-based dynamics (PBD) physics system. Introducing PBD into the NeRF formulation allows us to add collision constraints to the particle motion and creates future opportunities to add other movement priors into the system, such as rigid and deformable body constraints. We show that by allowing the features to move in space, we incrementally adapt the NeRF to the changing scene.
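The following is a minimal sketch of the core update described above, assuming PyTorch and hypothetical names such as `FeatureParticles` and `pbd_step` (not from the paper): the photometric loss is backpropagated into both the per-particle features and the particle positions, the position gradients are treated as velocities, and those velocities are integrated with a simple position-based dynamics (PBD) step. The constraint projection here is a placeholder clamp standing in for the paper's collision constraints.

```python
import torch

class FeatureParticles(torch.nn.Module):
    """Hypothetical container: particle positions with attached NeRF features."""
    def __init__(self, num_particles: int, feature_dim: int):
        super().__init__()
        # Both the positions and the features they carry receive gradients
        # from the photometric reconstruction loss.
        self.positions = torch.nn.Parameter(torch.rand(num_particles, 3))
        self.features = torch.nn.Parameter(torch.randn(num_particles, feature_dim) * 0.01)

def pbd_step(particles: FeatureParticles, loss: torch.Tensor,
             lr_feat: float = 1e-2, vel_scale: float = 1e-1, dt: float = 1.0):
    """One combined optimisation / physics step (illustrative only)."""
    grad_feat, grad_pos = torch.autograd.grad(loss, [particles.features, particles.positions])
    with torch.no_grad():
        # Standard gradient descent on the features.
        particles.features -= lr_feat * grad_feat
        # Interpret the (negative) position gradient as a particle velocity ...
        velocities = -vel_scale * grad_pos
        # ... and integrate it to obtain candidate positions.
        candidate = particles.positions + dt * velocities
        # A PBD system would now project the candidates onto constraint
        # manifolds (collision, rigidity, deformability). Here a clamp to
        # the unit box stands in for such a constraint projection.
        particles.positions.copy_(candidate.clamp(0.0, 1.0))
```

In an actual training loop, `loss` would be the photometric error between rendered and observed pixels, computed from a NeRF whose intermediate features are interpolated from the particles nearest each sample point.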