This article presents matrix backpropagation algorithms for the QR decomposition of matrices $A_{m,n}$ that are either square ($m = n$), wide ($m < n$), or deep ($m > n$), with rank $k = \min(m, n)$. Furthermore, we derive novel matrix backpropagation results for the pivoted (full-rank) QR decomposition and for the LQ decomposition of deep input matrices. Differentiable QR decomposition offers a numerically stable, computationally efficient method for solving the least squares problems frequently encountered in machine learning and computer vision. Software implementations across popular deep learning frameworks (PyTorch, TensorFlow, MXNet) incorporate the methods for general use within the deep learning community. Finally, this article aids the practitioner in understanding the matrix backpropagation methodology as part of larger computational graphs and, hopefully, leads to new lines of research.
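To illustrate the use case described above, the following is a minimal sketch (not the article's reference implementation) of differentiating through a QR-based least-squares solve using PyTorch's built-in `torch.linalg.qr` and `torch.linalg.solve_triangular`; the deep ($m > n$), full-column-rank setting is assumed so that the reduced factorization has a well-defined gradient.

```python
import torch

torch.manual_seed(0)

m, n = 8, 3                          # deep case (m > n), full column rank assumed
A = torch.randn(m, n, requires_grad=True)
b = torch.randn(m)

# Reduced QR: A = Q R with Q of shape (m, n) and R (n, n) upper triangular.
Q, R = torch.linalg.qr(A, mode="reduced")

# Least-squares solution x = R^{-1} Q^T b via a triangular solve,
# which is numerically stabler than forming the normal equations A^T A x = A^T b.
x = torch.linalg.solve_triangular(R, (Q.T @ b).unsqueeze(-1), upper=True).squeeze(-1)

# Any scalar loss on x backpropagates through the QR factorization to A.
loss = x.pow(2).sum()
loss.backward()
print(A.grad.shape)                  # torch.Size([8, 3])
```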