The fully-connected tensor network (FCTN) decomposition has gained prominence in tensor completion owing to its powerful capacity to capture the low-rank structure of tensors. Nevertheless, the recovery of local details in the reconstructed tensor still leaves room for improvement. In this paper, we propose an efficient tensor completion model that incorporates trace regularization within the FCTN decomposition framework. The trace regularization is constructed from the mode-$k$ unfoldings of the FCTN factors combined with a periodically modified negative Laplacian. It promotes the smoothness of the FCTN factors through discrete second-order derivative penalties, thereby enhancing the continuity and local-detail recovery of the reconstructed tensor. To solve the proposed model, we develop an efficient algorithm within the proximal alternating minimization (PAM) framework and theoretically prove its convergence. To reduce the runtime of the proposed algorithm, we design an intermediate-tensor reuse mechanism that decreases runtime by 10\%--30\% without affecting image recovery, with more significant gains on larger-scale data. A comprehensive complexity analysis shows that the mechanism attains a reduced computational complexity. Numerical experiments demonstrate that the proposed method outperforms existing approaches.
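As a minimal illustrative sketch (not the paper's implementation), the trace regularization described above can be understood as a quadratic smoothness penalty $\mathrm{tr}(G^{\top} L G)$ on a mode-$k$ unfolding $G$ of an FCTN factor, where $L$ is the negated discrete Laplacian with periodic (circulant) boundary modification. The function names and the exact boundary convention below are our assumptions for illustration only.

```python
import numpy as np

def periodic_neg_laplacian(n):
    """Negated discrete Laplacian with periodic boundary (assumed convention).

    The discrete Laplacian D (second-order difference operator) has -2 on the
    diagonal and 1 on the cyclic off-diagonals; negating it yields a positive
    semidefinite matrix, so tr(G^T L G) >= 0. Requires n >= 3.
    """
    D = -2.0 * np.eye(n)
    idx = np.arange(n)
    D[idx, (idx + 1) % n] = 1.0  # right cyclic neighbor
    D[idx, (idx - 1) % n] = 1.0  # left cyclic neighbor
    return -D

def trace_smoothness(G):
    """Trace-form smoothness penalty tr(G^T L G) on the rows of G.

    Expanding the quadratic form, this equals the sum of squared cyclic
    first differences of each column of G, so constant columns incur zero
    penalty and oscillatory columns are penalized.
    """
    L = periodic_neg_laplacian(G.shape[0])
    return float(np.trace(G.T @ L @ G))
```

A smooth (constant) factor unfolding gives a zero penalty, while a rapidly varying one gives a large value, which is the mechanism by which minimizing this term promotes continuity in the reconstructed tensor.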