We study the best low-rank Tucker decomposition of symmetric tensors, advocating a straightforward projected gradient descent (PGD) method for its computation. The main application of interest is the decomposition of higher-order multivariate moments, which are symmetric tensors. We develop scalable adaptations of the basic PGD method and of the higher-order eigenvalue decomposition (HOEVD) for decomposing sample moment tensors. With the help of implicit and streaming techniques, we avoid the overhead of building and storing the moment tensor; these reductions make computing the Tucker decomposition feasible for large data instances in high dimensions. Numerical experiments demonstrate the efficiency of the algorithms and the applicability of moment tensor decompositions to real-world datasets. Finally, we study convergence on the Grassmannian manifold and prove that the update sequence generated by the PGD solver achieves first- and second-order criticality.
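To make the setup concrete, the following is a minimal NumPy sketch of the symmetric Tucker problem for an order-3 tensor, solved by projected gradient ascent on the core norm (an equivalent formulation of the best low-rank Tucker fit). It is an illustration under simplifying assumptions only, not the paper's exact solver or its implicit/streaming variants: the step size, iteration count, and the polar retraction onto the Stiefel manifold are choices made here for brevity, and all function and parameter names are hypothetical.

```python
import numpy as np

def symmetric_tucker_pgd(T, r, step=1e-2, iters=500, seed=0):
    """Best rank-(r,r,r) symmetric Tucker fit of an order-3 symmetric
    tensor T (shape (n, n, n)) by projected gradient ascent on
    f(U) = ||T x1 U^T x2 U^T x3 U^T||_F^2 over orthonormal U (n x r)."""
    n = T.shape[0]
    rng = np.random.default_rng(seed)
    # Random orthonormal starting point via QR.
    U, _ = np.linalg.qr(rng.standard_normal((n, r)))
    for _ in range(iters):
        # Core tensor G = T x1 U^T x2 U^T x3 U^T (shape r x r x r).
        G = np.einsum('ijk,ia,jb,kc->abc', T, U, U, U)
        # Partial contraction W = T x2 U^T x3 U^T (shape n x r x r).
        W = np.einsum('ijk,jb,kc->ibc', T, U, U)
        # Euclidean gradient of f at U; the factor 6 = 2 * 3 comes from
        # the square and the symmetry of T across its three modes.
        grad = 6 * np.einsum('ibc,abc->ia', W, G)
        # Ascent step, then project back onto the Stiefel manifold via
        # the polar retraction (nearest orthonormal matrix to Y).
        Y = U + step * grad
        P, _, Qt = np.linalg.svd(Y, full_matrices=False)
        U = P @ Qt
    # Recompute the core for the final U.
    G = np.einsum('ijk,ia,jb,kc->abc', T, U, U, U)
    return U, G

# Usage on a small random symmetric tensor.
n, r = 20, 3
A = np.random.default_rng(1).standard_normal((n, n, n))
# Symmetrize over all six axis permutations.
T = sum(A.transpose(p) for p in
        [(0,1,2), (0,2,1), (1,0,2), (1,2,0), (2,0,1), (2,1,0)]) / 6
U, G = symmetric_tucker_pgd(T, r)
print(np.linalg.norm(G))  # core norm, i.e., the objective value
```

One design point the sketch reflects: because the tensor is symmetric, a single factor matrix U is shared across all modes, so each iteration costs one full and one partial contraction rather than three independent factor updates as in generic Tucker (HOOI-style) schemes.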