We give a classical algorithm for linear regression analogous to the quantum matrix inversion algorithm [Harrow, Hassidim, and Lloyd, Physical Review Letters'09, arXiv:0811.3171] for low-rank matrices [Wossnig, Zhao, and Prakash, Physical Review Letters'18, arXiv:1704.06174], when the input matrix $A$ is stored in a data structure applicable for QRAM-based state preparation. Namely, suppose we are given an $A \in \mathbb{C}^{m\times n}$ with minimum non-zero singular value $\sigma$ which supports certain efficient $\ell_2$-norm importance sampling queries, along with a $b \in \mathbb{C}^m$. Then, for some $x \in \mathbb{C}^n$ satisfying $\|x - A^+b\| \leq \varepsilon\|A^+b\|$, we can output a measurement of $|x\rangle$ in the computational basis and output an entry of $x$ with classical algorithms that run in $\tilde{\mathcal{O}}\big(\frac{\|A\|_{\mathrm{F}}^6\|A\|^6}{\sigma^{12}\varepsilon^4}\big)$ and $\tilde{\mathcal{O}}\big(\frac{\|A\|_{\mathrm{F}}^6\|A\|^2}{\sigma^8\varepsilon^4}\big)$ time, respectively. This improves on previous "quantum-inspired" algorithms in this line of research by at least a factor of $\frac{\|A\|^{16}}{\sigma^{16}\varepsilon^2}$ [Chia, Gily\'en, Li, Lin, Tang, and Wang, STOC'20, arXiv:1910.06151]. As a consequence, we show that quantum computers can achieve at most a factor-of-12 speedup for linear regression in this QRAM data structure setting and related settings. Our work applies techniques from sketching algorithms and optimization to the quantum-inspired literature. Unlike earlier works, this is a promising avenue that could lead to feasible implementations of classical regression in a quantum-inspired setting, for comparison against future quantum computers.
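To make the input access model concrete, the following is a minimal sketch (in Python with numpy, not taken from the paper) of the $\ell_2$-norm importance sampling ("sample and query") access to $A$ that the abstract assumes, together with the row-sampling primitive that sketching-based quantum-inspired algorithms build on. The class name \texttt{SampleQueryAccess} and the sample count \texttt{r} are illustrative choices, not the paper's; the actual data structure is a QRAM-style tree that answers these queries in time polylogarithmic in $m$ and $n$, rather than the dense-matrix scans used here.
\begin{verbatim}
# Illustrative sketch of l2-norm importance sampling ("sample and query")
# access to a matrix A, using plain numpy over a dense matrix.  The paper
# assumes these queries cost polylog(mn) via a tree data structure; this
# code only demonstrates the distributions being sampled from.
import numpy as np

rng = np.random.default_rng(0)

class SampleQueryAccess:
    def __init__(self, A):
        self.A = np.asarray(A, dtype=complex)
        self.row_sqnorms = np.sum(np.abs(self.A) ** 2, axis=1)  # ||A_i||^2
        self.fro_sqnorm = self.row_sqnorms.sum()                 # ||A||_F^2

    def sample_row(self):
        # Return i with probability ||A_i||^2 / ||A||_F^2.
        return rng.choice(self.A.shape[0], p=self.row_sqnorms / self.fro_sqnorm)

    def sample_entry_in_row(self, i):
        # Return j with probability |A_ij|^2 / ||A_i||^2.
        w = np.abs(self.A[i]) ** 2
        return rng.choice(self.A.shape[1], p=w / w.sum())

    def query(self, i, j):
        return self.A[i, j]

    def frobenius_norm(self):
        return np.sqrt(self.fro_sqnorm)

# Example use: an importance-sampled sketch S whose Gram matrix S^dag S
# approximates A^dag A, the basic primitive behind these algorithms.
A = rng.standard_normal((100, 20))
sq = SampleQueryAccess(A)
r = 500  # number of sampled rows (illustrative, not the paper's parameter)
rows = [sq.sample_row() for _ in range(r)]
S = np.stack([A[i] * sq.frobenius_norm() / (np.sqrt(r) * np.linalg.norm(A[i]))
              for i in rows])
err = np.linalg.norm(S.T @ S - A.T @ A) / np.linalg.norm(A.T @ A)
print(f"relative error of sketched Gram matrix: {err:.3f}")
\end{verbatim}
Sampling rows with probability proportional to their squared norms makes $S^\dagger S$ an unbiased estimator of $A^\dagger A$, which is what allows the running time to depend on $\|A\|_{\mathrm{F}}$, $\|A\|$, and $\sigma$ rather than on the dimensions $m$ and $n$.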