Tensor decomposition serves as a powerful primitive in statistics and machine learning. In this paper, we focus on using power iteration to decompose an overcomplete random tensor. Past work studying the properties of tensor power iteration either requires a non-trivial data-independent initialization, or is restricted to the undercomplete regime. Moreover, several papers implicitly suggest that logarithmically many iterations (in terms of the input dimension) are sufficient for the power method to recover one of the tensor components. In this paper, we analyze the dynamics of tensor power iteration from random initialization in the overcomplete regime. Surprisingly, we show that polynomially many steps are necessary for convergence of tensor power iteration to any of the true components, which refutes the previous conjecture. On the other hand, our numerical experiments suggest that tensor power iteration successfully recovers tensor components for a broad range of parameters, even though it takes at least polynomially many steps to converge. To further complement our empirical evidence, we prove that a popular objective function for tensor decomposition is strictly increasing along the power iteration path. Our proof is based on the Gaussian conditioning technique, which has been applied to analyze the approximate message passing (AMP) algorithm. The major ingredient of our argument is a conditioning lemma that allows us to generalize AMP-type analysis to the non-proportional limit and polynomially many iterations of the power method.
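For concreteness, the procedure analyzed above can be sketched as follows. This is a minimal illustration of tensor power iteration from random initialization on an overcomplete random order-3 tensor; the dimensions `d`, `k`, the iteration count, and all variable names are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
d, k = 50, 100                                  # overcomplete regime: k > d
A = rng.standard_normal((d, k)) / np.sqrt(d)    # random components a_1, ..., a_k
T = np.einsum('ia,ja,ka->ijk', A, A, A)         # T = sum_i a_i (x) a_i (x) a_i

# random initialization on the unit sphere
x = rng.standard_normal(d)
x /= np.linalg.norm(x)

# power iteration: x <- T(I, x, x) / ||T(I, x, x)||
# (the paper shows polynomially many steps may be needed to converge)
for _ in range(500):
    x = np.einsum('ijk,j,k->i', T, x, x)
    x /= np.linalg.norm(x)

# normalized correlation with the best-matching component
corr = np.max(np.abs(A.T @ x) / np.linalg.norm(A, axis=0))
```

The quantity `corr` measures how well the iterate aligns with its nearest component; tracking it across iterations is one way to observe the slow-then-sharp convergence behavior described in the abstract.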