We present AI-SARAH, a practical variant of SARAH. As a variant of SARAH, this algorithm employs the stochastic recursive gradient yet adjusts its step size based on local geometry. AI-SARAH implicitly computes the step size and efficiently estimates the local Lipschitz smoothness of the stochastic functions. It is fully adaptive, tune-free, straightforward to implement, and computationally efficient. We provide technical insight and intuitive illustrations of its design and convergence. We conduct extensive empirical analysis and demonstrate its strong performance, compared with its classical counterparts and other state-of-the-art first-order methods, in solving convex machine learning problems.
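To make the abstract's description concrete, the following is a minimal sketch of the general idea: a SARAH-style recursive gradient whose step size is set from an on-the-fly estimate of local Lipschitz smoothness. This is an illustrative stand-in, not AI-SARAH's actual implicit step-size computation; the least-squares problem, the safeguard cap, and the running-maximum smoothness estimate `L_run` are all assumptions made for this sketch.

```python
import numpy as np

# Toy least-squares problem: f(w) = (1/n) * sum_i 0.5 * (a_i^T w - b_i)^2
rng = np.random.default_rng(0)
n, d = 200, 5
A = rng.normal(size=(n, d))
w_star = rng.normal(size=d)
b = A @ w_star  # exact solution exists, so min f = 0

def grad_i(w, i):
    """Gradient of the i-th component function f_i."""
    return (A[i] @ w - b[i]) * A[i]

def full_grad(w):
    """Full gradient (1/n) * sum_i grad f_i(w)."""
    return A.T @ (A @ w - b) / n

def sarah_local_smoothness(w0, outer=20, inner=50, alpha0=0.1):
    """SARAH recursion with a step size from a local smoothness estimate.

    L_run tracks ||grad f_i(w_t) - grad f_i(w_{t-1})|| / ||w_t - w_{t-1}||,
    a finite-difference estimate of the local Lipschitz constant; the step
    alpha = 1/L_run is a hypothetical, conservative stand-in for the
    implicit step-size computation described in the abstract.
    """
    w = w0.copy()
    L_run = 1e-8  # running smoothness estimate (assumed safeguard)
    for _ in range(outer):
        v = full_grad(w)                 # full gradient at the snapshot
        w_prev, w = w, w - alpha0 * v
        for _ in range(inner):
            i = rng.integers(n)
            g_new, g_old = grad_i(w, i), grad_i(w_prev, i)
            v = g_new - g_old + v        # SARAH recursive gradient update
            diff = np.linalg.norm(w - w_prev)
            if diff > 1e-12:
                L_hat = np.linalg.norm(g_new - g_old) / diff
                L_run = max(L_run, L_hat)
            alpha = min(1.0 / L_run, 1.0)  # capped adaptive step
            w_prev, w = w, w - alpha * v
    return w

w_hat = sarah_local_smoothness(np.zeros(d))
```

Note that no step-size schedule is tuned by hand: the only inputs are loop lengths and an initial step, which is what "tune-free" refers to in the abstract.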