Tensor decomposition is a fundamentally challenging problem. Even the simplest case, rank-1 approximation in terms of the Least Squares (LS) error, is known to be NP-hard. Here, we show that, if we consider the KL divergence instead of the LS error, we can analytically derive a closed-form solution for the rank-1 tensor that minimizes the KL divergence from a given positive tensor. Our key insight is to treat a positive tensor as a probability distribution and formulate rank-1 approximation as a projection onto the set of rank-1 tensors. This enables us to solve rank-1 approximation by convex optimization. We empirically demonstrate that our algorithm is an order of magnitude faster than existing rank-1 approximation methods and yields a better approximation of the given tensors, which supports our theoretical finding.
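The abstract does not state the closed form itself. As a hedged illustration, the KL (m-)projection of a normalized positive tensor onto the rank-1 manifold is, by a standard information-geometric identity, the outer product of its mode-wise marginal distributions, rescaled by the tensor's total mass; this may or may not coincide exactly with the paper's algorithm. A minimal NumPy sketch of that construction (the function name `rank1_kl_approx` is ours):

```python
import numpy as np

def rank1_kl_approx(P):
    """Closed-form rank-1 approximation of a positive tensor P under the
    (generalized) KL divergence.

    Sketch assuming the standard result: the KL projection of a normalized
    positive tensor onto the rank-1 (independence) manifold is the outer
    product of its mode-wise marginals, rescaled by the total mass.
    """
    total = P.sum()
    Pn = P / total  # treat the positive tensor as a joint probability distribution
    # Marginal distribution along each mode: sum out all other axes.
    marginals = [Pn.sum(axis=tuple(k for k in range(P.ndim) if k != m))
                 for m in range(P.ndim)]
    # Outer product of the marginals, rescaled to match the original mass.
    Q = marginals[0]
    for m in marginals[1:]:
        Q = np.multiply.outer(Q, m)
    return total * Q

# Usage: approximate a random positive third-order tensor.
rng = np.random.default_rng(0)
P = rng.random((4, 5, 6)) + 0.1               # strictly positive tensor
Q = rank1_kl_approx(P)
print(np.linalg.matrix_rank(Q.reshape(4, -1)))  # 1: the unfolding has rank 1
```

Note that, unlike LS rank-1 approximation, no iteration is needed here: the marginals are computed in a single pass over the tensor, which is consistent with the claimed order-of-magnitude speedup.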