Neural fields are evolving towards a general-purpose continuous representation for visual computing. Yet, despite their numerous appealing properties, they are hardly amenable to signal processing. As a remedy, we present a method to perform general continuous convolutions with general continuous signals such as neural fields. Observing that piecewise polynomial kernels reduce to a sparse set of Dirac deltas after repeated differentiation, we leverage convolution identities and train a repeated integral field to efficiently execute large-scale convolutions. We demonstrate our approach on a variety of data modalities and spatially-varying kernels.
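The convolution identity underlying this approach can be illustrated in a discrete 1-D setting: a box kernel is piecewise polynomial (degree 0), so its derivative is a sparse pair of Dirac deltas at its endpoints, and convolving with the box reduces to two lookups into the running integral (prefix sum) of the signal. The sketch below is a minimal discrete analogue, not the paper's neural-field implementation; all variable names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
f = rng.standard_normal(256)   # arbitrary input signal
w = 9                          # box kernel width

# Direct convolution with a unit box kernel.
box = np.ones(w)
direct = np.convolve(f, box, mode="valid")

# Same result via integration + sparse deltas:
# differentiate the kernel (box -> +delta at one edge, -delta at the
# other) and integrate the signal once (prefix sum), then evaluate
# the integral at the two delta locations.
F = np.concatenate([[0.0], np.cumsum(f)])  # first integral of f
via_deltas = F[w:] - F[:-w]

assert np.allclose(direct, via_deltas)
```

Higher-degree piecewise polynomial kernels follow the same pattern with more repeated integrals of the signal and a correspondingly sparse set of deltas from repeated differentiation of the kernel; the continuous version in the paper replaces the prefix sum with a trained repeated integral field.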