Stochastic model-based methods have recently received increasing attention due to their appealing robustness to stepsize selection and their provable efficiency guarantees for non-smooth, non-convex optimization. To further improve the performance of stochastic model-based methods, we make two important extensions. First, we propose a new minibatch algorithm that uses a set of samples to approximate the model function in each iteration. We show, for the first time, that stochastic algorithms achieve a linear speedup over the batch size even for non-smooth and non-convex problems. To this end, we develop a novel sensitivity analysis of the proximal mapping involved in each iteration of the algorithm. Our analysis may be of independent interest in more general settings. Second, motivated by the success of momentum techniques in convex optimization, we propose a new stochastic extrapolated model-based method that can improve convergence in the non-smooth, non-convex setting. We obtain complexity guarantees for a fairly flexible range of extrapolation parameters. In addition, we conduct experiments that demonstrate the empirical advantages of our proposed methods.
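To make the minibatch model-based update concrete, the sketch below instantiates it for a least-squares loss, where the model of each sampled loss is the loss itself (a stochastic proximal-point step) and the proximal subproblem has a closed-form solution. This is a minimal illustration under that assumption, not the paper's exact algorithm; the function name `minibatch_model_step`, the batch size, and the stepsize `alpha` are hypothetical choices.

```python
import numpy as np

def minibatch_model_step(x, A_batch, y_batch, alpha):
    """One minibatch model-based (proximal-point) step for least squares:

        x_new = argmin_z (1/(2b)) * ||A_B z - y_B||^2
                          + (1/(2*alpha)) * ||z - x||^2,

    which reduces to a regularized linear system in z.
    Illustrative sketch only; names and constants are assumptions.
    """
    b, d = A_batch.shape
    H = A_batch.T @ A_batch / b + np.eye(d) / alpha
    rhs = A_batch.T @ y_batch / b + x / alpha
    return np.linalg.solve(H, rhs)

# Example run on synthetic data with a fixed stepsize.
rng = np.random.default_rng(0)
A, y = rng.normal(size=(1000, 5)), rng.normal(size=1000)
x = np.zeros(5)
for t in range(200):
    idx = rng.choice(1000, size=32, replace=False)  # minibatch of size 32
    x = minibatch_model_step(x, A[idx], y[idx], alpha=0.5)
```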
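The extrapolated variant can be sketched on top of the same setup. One natural form of extrapolation, assumed here purely for illustration (the paper's method may differ), moves to an extrapolated point before taking the model-based step; `beta = 0` recovers the plain method, and the value `beta = 0.7` is an arbitrary example.

```python
def extrapolated_model_step(x, x_prev, A_batch, y_batch, alpha, beta):
    """Extrapolated variant: form z = x + beta * (x - x_prev), then take
    the minibatch model-based step from z. Hypothetical instantiation of
    the extrapolation idea; reuses minibatch_model_step from above."""
    z = x + beta * (x - x_prev)
    return minibatch_model_step(z, A_batch, y_batch, alpha)

# Usage: keep the previous iterate around for the extrapolation term.
x_prev, x = np.zeros(5), np.zeros(5)
for t in range(200):
    idx = rng.choice(1000, size=32, replace=False)
    x, x_prev = extrapolated_model_step(x, x_prev, A[idx], y[idx],
                                        alpha=0.5, beta=0.7), x
```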