The multinomial logistic regression (MLR) model is widely used in statistics and machine learning. Stochastic gradient descent (SGD) is the most common approach for estimating the parameters of an MLR model in big-data scenarios. However, SGD has a slow, sub-linear rate of convergence. One way to improve this rate is to use manifold optimization. Along this line, stochastic natural gradient descent (SNGD), proposed by Amari, was proven to be Fisher efficient when it converges. However, SNGD is not guaranteed to converge, and it is computationally too expensive for MLR models with a large number of parameters. Here, we propose a stochastic optimization method for MLR based on manifold-optimization concepts that (i) has per-iteration computational complexity linear in the number of parameters and (ii) can be proven to converge. To achieve (i), we establish that the family of joint distributions for MLR is a dually flat manifold, and we use this structure to speed up calculations. Sánchez-López and Cerquides recently introduced convergent stochastic natural gradient descent (CSNGD), a variant of SNGD whose convergence is guaranteed. To obtain (ii), our algorithm uses the fundamental idea of CSNGD, relying on an independent sequence to build a bounded approximation of the natural gradient. We call the resulting algorithm dual stochastic natural gradient descent (DSNGD). By generalizing a result of Sunehag et al., we prove that DSNGD converges. Furthermore, we prove that the computational complexity of DSNGD iterations is linear in the number of variables of the model.
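To fix ideas on the contrast the abstract draws between plain SGD and natural-gradient preconditioning, the following minimal NumPy sketch implements the two generic updates, w <- w - lr * grad and w <- w - lr * F^{-1} grad, for a softmax (MLR) model. It is an illustration only, not the paper's DSNGD: the function names are ours, and the Fisher matrix is assumed to be given explicitly and well conditioned, precisely the O(n^3)-per-step assumption that motivates exploiting the dually flat structure instead.

    import numpy as np

    def softmax(z):
        z = z - z.max()            # shift logits for numerical stability
        e = np.exp(z)
        return e / e.sum()

    def sgd_step(w, x, y, lr, n_classes):
        # Vanilla SGD on the negative log-likelihood of MLR.
        # w: flat parameter vector of shape (d * k,), x: features (d,),
        # y: integer class label in {0, ..., k-1}.
        d = x.shape[0]
        W = w.reshape(d, n_classes)
        p = softmax(x @ W)
        p[y] -= 1.0                      # residual p - e_y
        grad = np.outer(x, p).ravel()    # Euclidean gradient, shape (d * k,)
        return w - lr * grad

    def sngd_step(w, x, y, lr, n_classes, fisher):
        # Natural-gradient step: precondition the stochastic gradient by
        # the inverse Fisher information matrix. Solving with the full
        # (d * k) x (d * k) Fisher matrix costs O(n^3) in the number of
        # parameters n, which is the bottleneck DSNGD is designed to avoid.
        d = x.shape[0]
        W = w.reshape(d, n_classes)
        p = softmax(x @ W)
        p[y] -= 1.0
        grad = np.outer(x, p).ravel()
        nat_grad = np.linalg.solve(fisher, grad)  # F^{-1} grad
        return w - lr * nat_grad

    # Toy usage with a placeholder Fisher matrix (the identity reduces the
    # natural-gradient step to vanilla SGD; a real Fisher matrix is dense).
    rng = np.random.default_rng(0)
    d, k = 4, 3
    w = np.zeros(d * k)
    x, y = rng.normal(size=d), 1
    w = sgd_step(w, x, y, lr=0.1, n_classes=k)
    fisher = np.eye(d * k)
    w = sngd_step(w, x, y, lr=0.1, n_classes=k, fisher=fisher)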