Variational Bayes (VB) is a critical method in machine learning and statistics, underpinning the recent success of Bayesian deep learning. The natural gradient is an essential component of efficient VB estimation, but it is computationally prohibitive in high dimensions. We propose a hybrid quantum-classical algorithm to improve the scaling of natural gradient computation and make VB a truly computationally efficient method for Bayesian inference in high-dimensional settings. The algorithm leverages matrix inversion from the linear systems algorithm of Harrow, Hassidim, and Lloyd [Phys. Rev. Lett. 103, 150502 (2009)] (HHL). We demonstrate that the matrix to be inverted is sparse and that the classical-quantum-classical handoffs are sufficiently economical to preserve computational efficiency, making the natural gradient for VB an ideal application of HHL. We prove that, under standard conditions, the VB algorithm with quantum natural gradient is guaranteed to converge.
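The computation at the heart of the abstract is a linear solve: the natural gradient is the solution of F x = g, where F is the Fisher information of the variational family and g the Euclidean gradient. A minimal classical sketch, assuming a mean-field Gaussian variational family in the (mu, log sigma) parameterization (all names here are illustrative, not the paper's implementation):

```python
import numpy as np

def fisher_meanfield_gaussian(sigma):
    # Fisher information of N(mu, diag(sigma^2)) in the (mu, log sigma)
    # parameterization is diagonal: 1/sigma^2 for the mu block and the
    # constant 2 for the log-sigma block.  This diagonal (hence sparse)
    # structure is the kind HHL-style solvers require.
    return np.concatenate([1.0 / sigma**2, 2.0 * np.ones_like(sigma)])

def natural_gradient(euclid_grad, sigma):
    # Natural gradient solves F x = g.  With diagonal F this is O(d);
    # for a dense F a classical solve costs O(d^3), which is the
    # high-dimensional bottleneck a quantum linear solver targets.
    return euclid_grad / fisher_meanfield_gaussian(sigma)

sigma = np.array([0.5, 1.0, 2.0])
g = np.ones(2 * sigma.size)      # Euclidean gradient w.r.t. (mu, log sigma)
nat = natural_gradient(g, sigma)
# mu-block entries are rescaled by sigma^2; log-sigma block by 1/2
```

The sketch only illustrates why the problem reduces to solving a (sparse) linear system; the paper's contribution is replacing this classical solve with HHL while keeping the handoffs between classical and quantum hardware cheap.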