Implicit neural representation (INR) can describe target scenes with high fidelity using a small number of parameters, and is emerging as a promising data compression technique. However, INR is intrinsically limited in spectrum coverage, and it is non-trivial to effectively remove redundancy in diverse, complex data. Preliminary studies can exploit only either the global or the local correlation in the target data, and thus achieve limited performance. In this paper, we propose Tree-structured Implicit Neural Compression (TINC), which builds compact representations of local regions and extracts the features shared among these local representations in a hierarchical manner. Specifically, we use MLPs to fit the partitioned local regions, and organize these MLPs in a tree structure so that they share parameters according to spatial distance. The parameter-sharing scheme not only ensures continuity between adjacent regions, but also jointly removes local and non-local redundancy. Extensive experiments show that TINC improves the compression fidelity of INR and achieves impressive compression capability compared with commercial tools and other deep-learning-based methods. Moreover, the approach is highly flexible and can be tailored to different data and parameter settings. The code for reproducing our results will be released on GitHub.
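To make the parameter-sharing scheme concrete, below is a minimal PyTorch sketch of the idea, not the authors' released implementation: it assumes a binary spatial partition for brevity (rather than an octree over 3D volumes), and the names `TreeINR`, `hidden`, and `leaf_index` are illustrative. Each tree node owns a small MLP block, and a local region is fitted by composing the blocks on the root-to-leaf path, so spatially closer regions (which share more ancestors) share more parameters.

```python
import torch
import torch.nn as nn

class TreeINR(nn.Module):
    """Sketch of a tree of MLP blocks with hierarchical parameter sharing.

    Level 0 holds the root block; level l holds 2**l blocks. A leaf region
    is fitted by the composition of blocks along its root-to-leaf path, so
    sibling regions reuse all ancestor parameters.
    """

    def __init__(self, depth=3, hidden=32, in_dim=3, out_dim=1):
        super().__init__()
        self.depth = depth
        self.blocks = nn.ModuleList()
        for level in range(depth + 1):
            d_in = in_dim if level == 0 else hidden
            d_out = out_dim if level == depth else hidden
            self.blocks.append(nn.ModuleList([
                nn.Sequential(
                    nn.Linear(d_in, d_out),
                    nn.Identity() if level == depth else nn.GELU(),
                )
                for _ in range(2 ** level)
            ]))

    def forward(self, coords, leaf_index):
        # Collect the root-to-leaf path (parent of node i at level l
        # is node i // 2 at level l - 1), then apply blocks root first.
        path = []
        node = leaf_index
        for level in range(self.depth, -1, -1):
            path.append((level, node))
            node //= 2
        h = coords
        for level, node in reversed(path):
            h = self.blocks[level][node](h)
        return h


model = TreeINR(depth=3)            # 8 leaf regions
xyz = torch.rand(1024, 3)           # normalized coordinates within one region
pred = model(xyz, leaf_index=5)     # predicted intensities for region 5
```

In this sketch, leaves 4 and 5 share their level-2 ancestor while leaves 0 and 5 only share the root, which mirrors the abstract's claim that parameters are shared according to spatial distance.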