We present an efficient low-rank approximation algorithm for non-negative tensors. The algorithm is derived from two findings: First, we show that rank-1 approximation of a tensor can be viewed as a mean-field approximation by treating the tensor as a probability distribution. Second, we theoretically provide a sufficient condition on the distribution parameters for reducing the Tucker ranks of tensors; interestingly, this sufficient condition can be achieved by iterative application of the mean-field approximation. Since the mean-field approximation is always given in closed form, our findings lead to a fast low-rank approximation algorithm that does not rely on a gradient method. We empirically demonstrate that our algorithm is faster than existing non-negative Tucker rank reduction methods while achieving competitive or better approximation of given tensors.
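As a minimal illustrative sketch (not the authors' implementation), the closed-form mean-field rank-1 step described above can be realized by normalizing a non-negative tensor into a joint distribution and taking the outer product of its mode-wise marginals; the helper name `mean_field_rank1` below is hypothetical.

```python
import numpy as np

def mean_field_rank1(T):
    """Closed-form rank-1 (mean-field) approximation of a non-negative tensor T."""
    total = T.sum()
    P = T / total  # treat the tensor as a probability distribution
    # Mode-d marginal: sum out every axis except d.
    marginals = [
        P.sum(axis=tuple(k for k in range(P.ndim) if k != d))
        for d in range(P.ndim)
    ]
    # The mean-field approximation is the outer product of the marginals.
    rank1 = marginals[0]
    for m in marginals[1:]:
        rank1 = np.multiply.outer(rank1, m)
    return total * rank1  # rescale back to the original mass

# Example usage on a random 3 x 4 x 5 non-negative tensor.
T = np.random.rand(3, 4, 5)
T1 = mean_field_rank1(T)
print(T1.shape)  # (3, 4, 5); T1 is a rank-1 non-negative tensor
```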