Variational Bayes (VB) is a critical method in machine learning and statistics, underpinning the recent success of Bayesian deep learning. The natural gradient is an essential component of efficient VB estimation, but it is prohibitively expensive to compute in high dimensions. We propose a computationally efficient, regression-based method for natural gradient estimation, with convergence guarantees under standard assumptions. The method enables the use of quantum matrix inversion to further speed up VB, and we demonstrate that the problem setup satisfies the conditions required for quantum matrix inversion to deliver a computational advantage. The method works with a broad range of statistical models and does not require special-purpose or simplified variational distributions.
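To make the regression idea concrete: for a variational family $q_\lambda$, the natural gradient is the ordinary ELBO gradient preconditioned by the inverse Fisher information, $F(\lambda)^{-1}\nabla_\lambda \mathcal{L}$. Since $F = \mathbb{E}_q[s\,s^\top]$ and the score-function form of the ELBO gradient is $\mathbb{E}_q[s\,h]$ with $s = \nabla_\lambda \log q_\lambda(\theta)$, the natural gradient coincides with the coefficient vector of an ordinary least-squares regression of the targets $h(\theta)$ on the scores $s(\theta)$ over Monte Carlo samples. The sketch below illustrates this for a diagonal Gaussian family; it is a minimal illustration of the regression trick, not the paper's implementation, and all names (`natural_gradient_ls`, `log_joint`) are hypothetical.

```python
import numpy as np

def natural_gradient_ls(log_joint, mu, log_sigma, n_samples=1000, rng=None):
    """Estimate the natural gradient of the ELBO by least-squares regression.

    Illustrative sketch: assumes a mean-field Gaussian family
    q_lambda = N(mu, diag(sigma^2)) with lambda = (mu, log sigma), and a
    user-supplied log_joint(theta) = log p(x, theta). Regressing the Monte
    Carlo targets h(theta) on the score vectors s = grad_lambda log q gives
    OLS coefficients (S^T S)^{-1} S^T h, i.e. an estimate of F^{-1} grad ELBO.
    """
    rng = rng if rng is not None else np.random.default_rng(0)
    sigma = np.exp(log_sigma)
    d = mu.size
    theta = mu + sigma * rng.standard_normal((n_samples, d))

    # Scores of the diagonal Gaussian w.r.t. (mu, log_sigma), stacked column-wise.
    z = (theta - mu) / sigma
    score = np.hstack([z / sigma, z**2 - 1.0])  # shape (n_samples, 2d)

    # Regression targets: h(theta) = log p(x, theta) - log q(theta | lambda).
    log_q = -0.5 * np.sum(z**2 + 2.0 * log_sigma + np.log(2.0 * np.pi), axis=1)
    h = np.array([log_joint(t) for t in theta]) - log_q

    # OLS solve: score @ beta ~ h  =>  beta estimates the natural gradient.
    beta, *_ = np.linalg.lstsq(score, h, rcond=None)
    return beta
```

The least-squares solve is exactly the linear-system step that a quantum matrix-inversion routine could, under the paper's stated conditions, accelerate; the classical `lstsq` call above simply stands in for that step.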