Neural implicit functions are highly effective for representing data. However, the implicit functions learned by neural networks often contain unexpected noisy artifacts or lose fine details when the input data spans many scales of detail or covers both low-frequency and high-frequency bands. Removing artifacts while preserving fine-scale content is challenging, and attempts to do so typically end in over-smoothing or residual noise. To resolve this dilemma, we propose a new framework (FINN) that integrates a filtering module into the MLP to reconstruct the data while adapting to regions of different frequencies. The filtering module comprises a smoothing operator, which acts on the network's intermediate results to encourage smoothness, and a recovering operator, which restores high frequencies to overly smoothed regions. The two counteracting operators are applied in sequence in every MLP layer to adaptively influence the reconstruction. We demonstrate the advantages of FINN on several tasks, showing significant improvements over state-of-the-art methods. In addition, FINN also yields faster convergence and better network stability.
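The per-layer interplay of the two counteracting operators can be illustrated with a minimal sketch. This is a hypothetical toy (not the paper's exact operators): it reconstructs a 1-D signal with a small MLP whose hidden activations pass through an assumed smoothing operator (a box filter over neighboring coordinates) and a recovering operator that re-injects the removed high-frequency residual through a gate; the names `moving_average`, `filtered_layer`, and the fixed `gate` value are illustrative choices, and in practice the gating would be learned.

```python
import numpy as np

rng = np.random.default_rng(0)

def moving_average(h, k=5):
    """Smoothing operator (assumed form): box filter along the sample axis."""
    pad = k // 2
    hp = np.pad(h, ((pad, pad), (0, 0)), mode="edge")
    return np.stack([hp[i:i + h.shape[0]] for i in range(k)]).mean(axis=0)

def filtered_layer(h, W, b, gate):
    """One MLP layer followed by the two counteracting operators."""
    z = np.tanh(h @ W + b)
    low = moving_average(z)   # smoothing: keep the low-frequency part
    high = z - low            # residual high-frequency content
    return low + gate * high  # recovering: re-inject gated detail

# Coordinates sorted so neighboring rows are spatial neighbors.
x = np.linspace(0.0, 1.0, 256)[:, None]
dims = [1, 32, 32, 1]
params = [(rng.normal(0.0, 1.0 / np.sqrt(m), (m, n)), np.zeros(n))
          for m, n in zip(dims[:-1], dims[1:])]

h = x
for W, b in params[:-1]:
    h = filtered_layer(h, W, b, gate=0.5)  # gate fixed here for illustration
W, b = params[-1]
y = h @ W + b  # linear output layer
print(y.shape)
```

With `gate = 1` the filtering is a no-op and the layer reduces to a plain MLP layer; with `gate = 0` every layer is fully low-pass filtered, so the gate interpolates between smoothing and detail preservation.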