Stochastic majorization-minimization (SMM) is an online extension of the classical principle of majorization-minimization, which consists of sampling i.i.d. data points from a fixed data distribution and minimizing a recursively defined majorizing surrogate of an objective function. In this paper, we introduce stochastic block majorization-minimization, where the surrogates are only required to be block multi-convex and a single block is optimized at a time within a diminishing radius. Relaxing the standard strong convexity requirement on surrogates in SMM, our framework applies more widely, including to online CANDECOMP/PARAFAC (CP) dictionary learning, and yields greater computational efficiency, especially when the problem dimension is large. We provide an extensive convergence analysis of the proposed algorithm under possibly dependent data streams, relaxing the standard i.i.d. assumption on data samples. We show that the proposed algorithm converges almost surely to the set of stationary points of a nonconvex objective under constraints at a rate $O((\log n)^{1+\epsilon}/n^{1/2})$ for the empirical loss function and $O((\log n)^{1+\epsilon}/n^{1/4})$ for the expected loss function, where $n$ denotes the number of data samples processed. Under an additional assumption, the latter convergence rate can be improved to $O((\log n)^{1+\epsilon}/n^{1/2})$. Our results provide the first convergence rate bounds for various online matrix and tensor decomposition algorithms under a general Markovian data setting.
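As a rough sketch of the type of update the abstract describes (not the precise formulation from the paper), one iteration of stochastic block majorization-minimization averages the surrogates recursively and then minimizes a single block within a trust region of shrinking radius; the weights $w_n$, radii $r_n$, and block decomposition $\theta = (\theta^{(1)}, \dots, \theta^{(m)}) \in \Theta^{(1)} \times \cdots \times \Theta^{(m)}$ below are notation assumed here for illustration only:
\[
\bar{g}_n(\theta) \;=\; (1 - w_n)\,\bar{g}_{n-1}(\theta) \;+\; w_n\, g(\theta, x_n),
\qquad
\theta_n^{(i)} \;\in\; \operatorname*{arg\,min}_{\substack{\theta^{(i)} \in \Theta^{(i)} \\ \|\theta^{(i)} - \theta_{n-1}^{(i)}\| \,\le\, r_n}}
\bar{g}_n\!\big(\theta_n^{(1)}, \dots, \theta_n^{(i-1)}, \theta^{(i)}, \theta_{n-1}^{(i+1)}, \dots, \theta_{n-1}^{(m)}\big),
\]
where $x_n$ is the data sample drawn at iteration $n$ and $g(\cdot, x_n)$ is a majorizing surrogate of the per-sample loss. Because $\bar{g}_n$ is only block multi-convex, each block subproblem is convex even though the joint surrogate need not be, and the diminishing radii $r_n$ play the stabilizing role that strong convexity of the surrogate plays in standard SMM.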