Tensor completion refers to the task of estimating missing entries from an incomplete measurement or observation, a core problem arising frequently in big data analysis, computer vision, and network engineering. Due to the multidimensional nature of high-order tensors, matrix-based approaches, e.g., matrix factorization and direct matricization of tensors, are often not ideal for tensor completion and recovery. In this paper, we introduce a unified low-rank and sparse enhanced Tucker decomposition model for tensor completion. Our model includes a sparse regularization term that promotes a sparse core tensor in the Tucker decomposition, which is beneficial for tensor data compression. Moreover, we impose low-rank regularization terms on the factor matrices of the Tucker decomposition to induce low-rankness of the tensor at low computational cost. Numerically, we propose a customized alternating direction method of multipliers (ADMM) with easily solvable subproblems for the underlying model. Remarkably, our model can handle different types of real-world data sets, since it exploits the potential periodicity and inherent correlation properties that appear in tensors. A series of computational experiments on real-world data sets, including internet traffic data, color images, and face recognition data, demonstrates that our model achieves higher recovery accuracy than many existing state-of-the-art matricization and tensorization approaches.
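For concreteness, one plausible instantiation of such a model is sketched below; the symbols $\lambda$, $\mu_n$, and $\mathcal{P}_{\Omega}$, as well as the use of the $\ell_1$ norm for the core tensor and the nuclear norm for the factor matrices, are assumptions made for illustration and need not match the paper's exact formulation:
\[
\min_{\mathcal{G},\,A_1,\dots,A_N}\ \frac{1}{2}\Big\|\mathcal{P}_{\Omega}\big(\mathcal{X}-\mathcal{G}\times_1 A_1\times_2 A_2\cdots\times_N A_N\big)\Big\|_F^2
\;+\;\lambda\,\|\mathcal{G}\|_1\;+\;\sum_{n=1}^{N}\mu_n\,\|A_n\|_*,
\]
where $\mathcal{X}$ is the partially observed tensor, $\mathcal{P}_{\Omega}$ projects onto the observed entries, $\mathcal{G}$ is the core tensor, $A_n$ are the factor matrices, $\times_n$ denotes the mode-$n$ product, and $\lambda,\mu_n>0$ are regularization parameters balancing data fidelity, core sparsity, and factor low-rankness.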