This article proposes new multiplicative updates for nonnegative matrix factorization (NMF) with the $\beta$-divergence objective function. Our new updates are derived from a joint majorization-minimization (MM) scheme, in which an auxiliary function (a tight upper bound of the objective function) is built for the two factors jointly and minimized at each iteration. This is in contrast with the classic approach, in which a majorizer is derived for each factor separately. Like that classic approach, our joint MM algorithm results in multiplicative updates that are simple to implement. However, they yield a significant reduction in computation time (for equally good solutions), in particular for some $\beta$-divergences of significant practical interest, such as the squared Euclidean distance and the Kullback-Leibler or Itakura-Saito divergences. We report experimental results on diverse datasets: face images, an audio spectrogram, hyperspectral data, and song play counts. Depending on the value of $\beta$ and on the dataset, our joint MM approach yields CPU time reductions of about $13\%$ to $78\%$ in comparison with the classic alternating scheme.
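For context, the classic alternating scheme that the proposed joint MM updates are compared against can be sketched as follows. This is a minimal NumPy illustration of the standard $\beta$-divergence multiplicative update rules (one pass updates $\mathbf{H}$, then $\mathbf{W}$), not the joint updates proposed in the article; the function names and the heuristic form of the updates are assumptions for illustration.

```python
import numpy as np

def beta_divergence(V, Vhat, beta):
    """Beta-divergence D_beta(V | Vhat), assuming strictly positive entries."""
    if beta == 1:  # Kullback-Leibler
        return np.sum(V * np.log(V / Vhat) - V + Vhat)
    if beta == 0:  # Itakura-Saito
        return np.sum(V / Vhat - np.log(V / Vhat) - 1)
    # General case (beta = 2 recovers half the squared Euclidean distance)
    return np.sum((V**beta + (beta - 1) * Vhat**beta
                   - beta * V * Vhat**(beta - 1)) / (beta * (beta - 1)))

def mu_step(V, W, H, beta):
    """One pass of the classic alternating multiplicative updates.

    Each factor is updated with the other held fixed; the ratio form
    guarantees nonnegativity is preserved.
    """
    WH = W @ H
    H *= (W.T @ (WH**(beta - 2) * V)) / (W.T @ WH**(beta - 1))
    WH = W @ H  # recompute after updating H
    W *= ((WH**(beta - 2) * V) @ H.T) / (WH**(beta - 1) @ H.T)
    return W, H
```

For $\beta \in [1, 2]$ these updates provably decrease the objective at every iteration, which is the MM (auxiliary-function) property the article builds on; the joint scheme differs in majorizing over both factors at once rather than alternating.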