Incremental Expectation Maximization (EM) algorithms were introduced to adapt EM to the large-scale learning setting by avoiding processing the full data set at each iteration. Nevertheless, these algorithms all assume that the conditional expectations of the sufficient statistics are available in closed form. In this paper, we propose a novel algorithm named Perturbed Prox-Preconditioned SPIDER (3P-SPIDER), which builds on the Stochastic Path Integral Differential EstimatoR EM (SPIDER-EM) algorithm. The 3P-SPIDER algorithm addresses many intractabilities of the E-step of EM; it also handles non-smooth regularization and convex constraint sets. Numerical experiments show that 3P-SPIDER outperforms other incremental EM methods, and we discuss the role of some design parameters.
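To make the algorithmic ingredients concrete, the sketch below illustrates the general mechanics described in the abstract: a SPIDER-style variance-reduced recursive estimator of the per-sample sufficient statistics (with periodic full refreshes), followed by a proximal step that handles a non-smooth L1 regularizer. This is a minimal toy illustration, not the paper's exact 3P-SPIDER algorithm: the statistic map `s_i`, the toy data model, the step sizes, and the identity preconditioner are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: noisy observations of a sparse mean vector (illustrative only).
n, d = 200, 5
true_theta = np.array([1.0, 0.0, 0.0, -2.0, 0.0])
X = true_theta + rng.normal(size=(n, d))

def s_i(theta, x):
    # Hypothetical per-sample conditional-expectation statistic, standing in
    # for the E-step quantity that real models may not have in closed form.
    return 0.5 * (x + theta)

def prox_l1(v, lam):
    # Proximal operator of lam * ||.||_1 (soft-thresholding), handling
    # the non-smooth regularization term.
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def spider_prox_em(X, n_epochs=5, batch=20, lam=0.05, step=0.5):
    n, d = X.shape
    theta = np.zeros(d)
    theta_prev = theta.copy()
    # Full pass initializes the SPIDER estimate of the mean statistic.
    S = np.mean([s_i(theta, x) for x in X], axis=0)
    for _ in range(n_epochs):
        for _ in range(n // batch):
            idx = rng.choice(n, size=batch, replace=False)
            # Variance-reduced recursive update: correct S by the change of
            # the statistics on a minibatch between successive iterates.
            S = S + np.mean(
                [s_i(theta, X[i]) - s_i(theta_prev, X[i]) for i in idx], axis=0
            )
            theta_prev = theta.copy()
            # Prox step: move along the (statistic - parameter) direction,
            # then apply the proximal map of the L1 penalty.
            theta = prox_l1(theta + step * (S - theta), lam * step)
        # Periodic full refresh of S (outer loop of SPIDER-type schemes).
        S = np.mean([s_i(theta, x) for x in X], axis=0)
    return theta

theta_hat = spider_prox_em(X)
```

The recursive update of `S` is the key point: each inner iteration touches only a minibatch, yet `S` remains a low-variance estimate of the full-data statistic thanks to the difference correction and the periodic full refresh.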