Image warping aims to reshape images defined on rectangular grids into arbitrary shapes. Recently, implicit neural functions have shown remarkable performance in representing images in a continuous manner. However, a standalone multi-layer perceptron struggles to learn high-frequency Fourier coefficients. In this paper, we propose a local texture estimator for image warping (LTEW), followed by an implicit neural representation, to deform images into continuous shapes. Local textures estimated from a deep super-resolution (SR) backbone are multiplied by locally-varying Jacobian matrices of a coordinate transformation to predict the Fourier responses of a warped image. Our LTEW-based neural function outperforms existing warping methods for asymmetric-scale SR and homography transform. Furthermore, our algorithm generalizes well to arbitrary coordinate transformations that are not provided in training, such as homography transform with a large magnification factor and equirectangular projection (ERP) perspective transform.
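To make the described mechanism concrete, the sketch below is a minimal, illustrative reconstruction rather than the authors' implementation: the function name `warped_fourier_features`, the array shapes, and the per-cell frequency/amplitude/phase layout are all assumptions. It shows how frequencies predicted by a local texture estimator could be modulated by the local Jacobian of the coordinate transformation before evaluating a Fourier expansion at a query coordinate, so that the representation adapts to the locally-varying scale and orientation of the warp.

```python
import numpy as np

def warped_fourier_features(freqs, amps, phases, jacobian, local_coord):
    """Illustrative sketch (hypothetical names and shapes):
    frequencies from a local texture estimator are modulated by the
    local Jacobian of the coordinate transformation, then a Fourier
    expansion is evaluated at a query coordinate.

    freqs       (K, 2): per-cell frequency vectors from the SR backbone head
    amps        (K,)  : amplitudes
    phases      (K,)  : phases
    jacobian    (2, 2): local Jacobian of the coordinate mapping
    local_coord (2,)  : query offset relative to the nearest input pixel
    """
    # Frequencies are transformed by the Jacobian so the sinusoids
    # follow the locally-varying shape of the warp.
    warped_freqs = freqs @ jacobian                      # (K, 2)
    angle = 2.0 * np.pi * warped_freqs @ local_coord + phases  # (K,)
    # Fourier response: amplitude-weighted sinusoids summed over K components.
    return np.sum(amps * np.cos(angle))

# Toy usage: with an identity Jacobian, this reduces to an ordinary
# local Fourier expansion at the query coordinate.
rng = np.random.default_rng(0)
K = 8
value = warped_fourier_features(
    freqs=rng.normal(size=(K, 2)),
    amps=rng.normal(size=K),
    phases=rng.uniform(0, 2 * np.pi, size=K),
    jacobian=np.eye(2),
    local_coord=np.array([0.3, -0.1]),
)
print(value)
```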