The nuclear norm and the Schatten-$p$ quasi-norm of a matrix are popular rank proxies in low-rank matrix recovery. Unfortunately, computing the nuclear norm or the Schatten-$p$ quasi-norm of a tensor is NP-hard, which is a major obstacle for low-rank tensor completion (LRTC) and tensor robust principal component analysis (TRPCA). In this paper, we propose a new class of rank regularizers based on the Euclidean norms of the CP component vectors of a tensor and show that these regularizers are monotonic transformations of the tensor Schatten-$p$ quasi-norm. This connection enables us to minimize the Schatten-$p$ quasi-norm implicitly in LRTC and TRPCA. The resulting methods do not require the singular value decomposition and therefore scale to large tensors. Moreover, they are insensitive to the choice of the initial rank and provide an arbitrarily sharper rank proxy for low-rank tensor recovery than the nuclear norm. We also provide theoretical guarantees on the recovery error for LRTC and TRPCA, which show that a smaller $p$ in the Schatten-$p$ quasi-norm leads to a tighter error bound. Experiments on LRTC and TRPCA with synthetic data and natural images verify the effectiveness and superiority of our methods over baseline methods.
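To make the construction concrete, the following minimal numpy sketch evaluates one plausible member of such a regularizer family, $\sum_{i=1}^{r}\bigl(\prod_{k}\|u_i^{(k)}\|_2\bigr)^p$, directly from the CP factor matrices; the exact functional form used in the paper may differ, and the function name, shapes, and parameter choices here are illustrative assumptions. The point it illustrates is that the regularizer is computed from factor columns alone, so no SVD of the full tensor is needed.

```python
import numpy as np

def cp_euclidean_rank_surrogate(factors, p=0.5):
    """Illustrative CP-based rank surrogate (an assumed form, not the paper's
    exact definition): sum_i (prod_k ||u_i^{(k)}||_2)^p over the r CP components,
    where factors[k] is the mode-k factor matrix of shape (n_k, r).
    Uses only the factor matrices, so no SVD of the full tensor is required."""
    # Column-wise Euclidean norms of each factor matrix: shape (num_modes, r)
    col_norms = np.stack([np.linalg.norm(U, axis=0) for U in factors])
    # Product of the norms across modes gives one scalar per CP component
    component_scales = np.prod(col_norms, axis=0)
    # Schatten-p-style aggregation of the per-component scales
    return float(np.sum(component_scales ** p))

# Minimal usage example on a random rank-r CP factorization of a 3-way tensor
rng = np.random.default_rng(0)
n1, n2, n3, r = 50, 40, 30, 5
A = rng.standard_normal((n1, r))
B = rng.standard_normal((n2, r))
C = rng.standard_normal((n3, r))
print(cp_euclidean_rank_surrogate([A, B, C], p=0.5))
```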