Representing a signal as a continuous function parameterized by a neural network (a.k.a. an Implicit Neural Representation, INR) has attracted increasing attention in recent years. Neural Processes (NPs), which model the distribution over functions conditioned on partial observations (the context set), provide a practical solution for fast inference of continuous functions. However, existing NP architectures suffer from limited modeling capability for complex signals. In this paper, we propose an efficient NP framework dubbed Versatile Neural Processes (VNP), which substantially increases the capability of approximating functions. Specifically, we introduce a bottleneck encoder that produces fewer but more informative context tokens, relieving the high computational cost while providing high modeling capability. At the decoder side, we hierarchically learn multiple global latent variables that jointly model the global structure and the uncertainty of a function, enabling our model to capture the distribution of complex signals. We demonstrate the effectiveness of the proposed VNP on a variety of tasks involving 1D, 2D and 3D signals. In particular, our method shows promise in learning accurate INRs of a 3D scene without further fine-tuning. Code is available at https://github.com/ZongyuGuo/Versatile-NP .