Linear regression is a widely used technique for fitting linear models, with applications across areas such as machine learning and statistics. In many real-world scenarios, however, linear regression problems are ill-posed or the underlying model overfits, leading to erroneous or trivial solutions. This is often dealt with by adding extra constraints, a process known as regularization. In this paper, we use the frameworks of block-encoding and quantum singular value transformation (QSVT) to design new quantum algorithms for quantum least squares with general $\ell_2$-regularization. These include regularized versions of quantum ordinary least squares, quantum weighted least squares, and quantum generalized least squares. Our quantum algorithms substantially improve upon prior results on quantum ridge regression (where the regularization term is proportional to the $\ell_2$-norm of the least squares solution), which is a particular case of our result. To this end, we assume approximate block-encodings of the underlying matrices as input and use robust QSVT algorithms for various linear algebra operations. In particular, we develop a variable-time quantum algorithm for matrix inversion using QSVT, where we use quantum eigenvalue discrimination as a subroutine instead of gapped phase estimation. This ensures that fewer additional qubits are required for this procedure than in prior results. Owing to the generality of the block-encoding framework, our algorithms can be implemented on a variety of input models and can also be seen as improved and generalized versions of the standard (non-regularized) quantum least squares algorithms of Chakraborty et al. [ICALP 2019].
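For readers unfamiliar with the classical problem being quantized, the following is a minimal sketch (not part of the paper, and purely classical) of $\ell_2$-regularized least squares, i.e. ridge regression: minimize $\|Ax - b\|_2^2 + \lambda \|x\|_2^2$, whose closed-form solution is $x^* = (A^\top A + \lambda I)^{-1} A^\top b$. The matrix $A$, vector $b$, and regularization strength $\lambda$ below are illustrative placeholders.

```python
import numpy as np

# Illustrative problem instance (random data, hypothetical sizes).
rng = np.random.default_rng(0)
A = rng.normal(size=(8, 3))   # design matrix
b = rng.normal(size=8)        # observation vector
lam = 0.5                     # regularization strength lambda > 0

# Closed-form ridge solution: x* = (A^T A + lam * I)^{-1} A^T b.
# The added lam * I term makes the normal-equations matrix
# well-conditioned even when A^T A is singular (ill-posed case).
x = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ b)

# Optimality check: the gradient of the objective,
# 2 A^T (A x - b) + 2 lam x, vanishes at the minimizer.
grad = A.T @ (A @ x - b) + lam * x
```

The quantum algorithms in the paper prepare a quantum state proportional to this solution vector given block-encoded access to the input matrices, rather than computing $x^*$ entry by entry as done here.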