This paper tackles the problem of recovering a low-rank signal tensor with possibly correlated components from a random noisy tensor, the so-called spiked tensor model. When the underlying components are orthogonal, they can be recovered efficiently by tensor deflation, which consists of successive rank-one approximations, whereas non-orthogonal components may alter the deflation mechanism and thereby prevent efficient recovery. Relying on recently developed random tensor tools, this paper addresses precisely the non-orthogonal case by deriving an asymptotic analysis of a parameterized deflation procedure applied to an order-three, rank-two spiked tensor. Based on this analysis, an efficient tensor deflation algorithm is proposed by optimizing the parameter introduced in the deflation mechanism, which is then proven to be optimal by construction for the studied tensor model. The same ideas extend to more general low-rank tensor models, e.g., higher ranks and orders, leading to more efficient tensor methods with a broader impact on machine learning and beyond.
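To make the deflation mechanism concrete, the following is a minimal sketch of vanilla tensor deflation on a symmetric order-three tensor: each component is estimated by tensor power iteration and then subtracted as a rank-one term. This is an illustrative baseline only, not the paper's parameterized deflation procedure; the function names, the NumPy setup, and the iteration counts are assumptions for the sake of the example.

```python
import numpy as np

def rank1_power_iteration(T, n_iter=100, seed=0):
    """Approximate the dominant symmetric rank-one term of an order-3
    tensor T via tensor power iteration (illustrative sketch)."""
    n = T.shape[0]
    rng = np.random.default_rng(seed)
    u = rng.standard_normal(n)
    u /= np.linalg.norm(u)
    for _ in range(n_iter):
        # Contract T along two modes: v_i = sum_{j,k} T_{ijk} u_j u_k
        v = np.einsum('ijk,j,k->i', T, u, u)
        u = v / np.linalg.norm(v)
    # Estimated signal strength lambda = T(u, u, u)
    lam = np.einsum('ijk,i,j,k->', T, u, u, u)
    return lam, u

def deflate(T, rank=2):
    """Successive rank-one approximations: estimate a component,
    subtract it, and repeat on the residual tensor."""
    estimates = []
    for r in range(rank):
        lam, u = rank1_power_iteration(T, seed=r)
        estimates.append((lam, u))
        # Remove the estimated rank-one term before the next step
        T = T - lam * np.einsum('i,j,k->ijk', u, u, u)
    return estimates
```

In the noiseless orthogonal case this recovers both components exactly, which is the regime where plain deflation works; the paper's contribution concerns precisely the correlated (non-orthogonal) regime, where the subtraction step above is biased and the deflation must be parameterized and optimized.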