This paper tackles the problem of recovering a low-rank signal tensor with possibly correlated components from a noisy random tensor, the so-called spiked tensor model. When the underlying components are orthogonal, they can be recovered efficiently using tensor deflation, which consists of successive rank-one approximations, whereas non-orthogonal components may alter the deflation mechanism and thereby prevent efficient recovery. Relying on recently developed random tensor tools, this paper addresses precisely the non-orthogonal case through an asymptotic analysis of a parameterized deflation procedure applied to an order-three, rank-two spiked tensor. Based on this analysis, an efficient tensor deflation algorithm is proposed that optimizes the parameter introduced in the deflation mechanism and is, by construction, optimal for the studied tensor model. The same ideas could be extended to more general low-rank tensor models, e.g., higher ranks and orders, leading to more efficient tensor methods with a broader impact on machine learning and beyond.
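As background for the deflation mechanism discussed above, the following is a minimal sketch (in NumPy) of plain tensor deflation on an order-three, rank-two spiked tensor with correlated components: each step extracts a rank-one approximation by tensor power iteration and subtracts it from the observation. All names and parameter values (n, beta1, beta2, alpha, rank_one_approx) are illustrative assumptions for this sketch and do not reproduce the paper's parameterized deflation procedure.

```python
import numpy as np

def rank_one_approx(T, n_iter=200):
    """Rank-one approximation lam * (u x v x w) of an order-three tensor via power iteration."""
    n = T.shape[0]
    u = np.random.randn(n); u /= np.linalg.norm(u)
    v = np.random.randn(n); v /= np.linalg.norm(v)
    w = np.random.randn(n); w /= np.linalg.norm(w)
    for _ in range(n_iter):
        u = np.einsum('ijk,j,k->i', T, v, w); u /= np.linalg.norm(u)
        v = np.einsum('ijk,i,k->j', T, u, w); v /= np.linalg.norm(v)
        w = np.einsum('ijk,i,j->k', T, u, v); w /= np.linalg.norm(w)
    lam = np.einsum('ijk,i,j,k->', T, u, v, w)
    return lam, u, v, w

# Rank-two spike with correlated (non-orthogonal) unit components x and y,
# correlation alpha, plus Gaussian noise scaled by 1/sqrt(n) (illustrative values).
n, beta1, beta2, alpha = 100, 10.0, 8.0, 0.5
x = np.random.randn(n); x /= np.linalg.norm(x)
z = np.random.randn(n); z -= (z @ x) * x; z /= np.linalg.norm(z)
y = alpha * x + np.sqrt(1 - alpha**2) * z

noise = np.random.randn(n, n, n) / np.sqrt(n)
T = (beta1 * np.einsum('i,j,k->ijk', x, x, x)
     + beta2 * np.einsum('i,j,k->ijk', y, y, y) + noise)

# Deflation: estimate the dominant rank-one component, subtract it, then repeat.
lam1, u1, v1, w1 = rank_one_approx(T)
T_deflated = T - lam1 * np.einsum('i,j,k->ijk', u1, v1, w1)
lam2, u2, v2, w2 = rank_one_approx(T_deflated)

print('alignment of first estimate with x :', abs(u1 @ x))
print('alignment of second estimate with y:', abs(u2 @ y))
```

With orthogonal components (alpha = 0) the two estimates align well with x and y, while increasing alpha degrades the second estimate, illustrating the alteration of the deflation mechanism that motivates the parameterized procedure analyzed in the paper.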