We present a novel approach to black-box variational inference (VI) that bypasses the difficulties of stochastic gradient ascent, including the task of selecting step sizes. Our approach solves a sequence of sample average approximation (SAA) problems. SAA approximates the solution of a stochastic optimization problem by transforming it into a deterministic one. We use quasi-Newton methods and line search to solve each deterministic optimization problem, and we present a heuristic policy that automates hyperparameter selection. Our experiments show that our method simplifies the VI problem and converges faster than existing methods.
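To make the SAA idea concrete, the following is a minimal sketch, not the paper's implementation: it freezes one batch of Monte Carlo samples so the ELBO of a toy one-dimensional Gaussian approximation becomes a deterministic function, then optimizes it with a quasi-Newton method with line search (SciPy's L-BFGS-B). The target density `log_p`, the sample count, and all variable names are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Toy "posterior" to approximate (an assumption for illustration): N(3, 1).
def log_p(z):
    return norm.logpdf(z, loc=3.0, scale=1.0)

# SAA step: draw the noise ONCE and freeze it. The ELBO below is then a
# deterministic function of the variational parameters.
rng = np.random.default_rng(0)
eps = rng.standard_normal(64)

def negative_elbo(params):
    mu, log_sigma = params
    sigma = np.exp(log_sigma)
    z = mu + sigma * eps  # reparameterized draws from q(z) = N(mu, sigma^2)
    # ELBO = E_q[log p(z)] + entropy of q (Gaussian entropy in closed form).
    elbo = log_p(z).mean() + 0.5 * np.log(2 * np.pi * np.e) + log_sigma
    return -elbo

# Solve the deterministic problem with a quasi-Newton method + line search,
# with no step-size schedule to tune.
result = minimize(negative_elbo, x0=np.zeros(2), method="L-BFGS-B")
mu_hat, sigma_hat = result.x[0], np.exp(result.x[1])
```

Because the samples are fixed, the optimizer sees an ordinary smooth objective, so standard line-search machinery replaces hand-tuned learning rates; a full method would re-draw samples and re-solve over a sequence of such problems.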