The nuclear norm and Schatten-$p$ quasi-norm are popular rank proxies in low-rank matrix recovery. Unfortunately, computing the nuclear norm or Schatten-$p$ quasi-norm of a tensor is NP-hard, which is an obstacle to low-rank tensor completion (LRTC) and tensor robust principal component analysis (TRPCA). In this paper, we propose a new class of tensor rank regularizers based on the Euclidean norms of the CP component vectors of a tensor and show that these regularizers are monotonic transformations of the tensor Schatten-$p$ quasi-norm. This connection enables us to minimize the Schatten-$p$ quasi-norm in LRTC and TRPCA implicitly. The resulting methods do not require the singular value decomposition and hence scale to large tensors. Moreover, they are insensitive to the choice of initial rank and provide an arbitrarily sharper rank proxy for low-rank tensor recovery than the nuclear norm. Furthermore, we study the generalization abilities of LRTC with Schatten-$p$ quasi-norm regularization and LRTC with our regularizers. Our theorems show that a sharper regularizer leads to a tighter error bound, which is consistent with our numerical results. Numerical experiments on synthetic and real data demonstrate the effectiveness and superiority of our methods over baseline methods.
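For intuition, the display below sketches the kind of factor-norm construction the abstract alludes to. The matrix identity is a standard variational characterization of the nuclear norm; the tensor expression $R_q$ and its exponent $q$ are illustrative assumptions, not the paper's exact definition.

\[
\|X\|_* \;=\; \min_{X = UV^{\top}} \sum_{i} \|u_i\|_2 \, \|v_i\|_2 ,
\]
where $u_i$ and $v_i$ denote the columns of $U$ and $V$, and the minimum is over all factorizations with at least $\operatorname{rank}(X)$ columns. A hypothetical tensor analogue, consistent with the abstract's description, penalizes the Euclidean norms of the CP component vectors of an order-$N$ tensor:
\[
R_q(\mathcal{X}) \;=\; \min_{\mathcal{X} \,=\, \sum_{i=1}^{r} u_i^{(1)} \circ \cdots \circ u_i^{(N)}} \;\sum_{i=1}^{r} \prod_{n=1}^{N} \big\|u_i^{(n)}\big\|_2^{\,q}, \qquad q \in (0,1].
\]
The abstract's claim is that regularizers of this factor-norm type are monotonic transformations of the tensor Schatten-$p$ quasi-norm, so minimizing them over CP factors implicitly minimizes the quasi-norm without ever forming a singular value decomposition.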