This article presents matrix backpropagation algorithms for the QR decomposition of matrices $A_{m,n}$ that are either square ($m = n$), wide ($m < n$), or deep ($m > n$), with rank $k = \min(m, n)$. Furthermore, we derive a novel matrix backpropagation result for the LQ decomposition of deep input matrices. Differentiable QR decomposition offers a numerically stable, computationally efficient method for solving least squares problems frequently encountered in machine learning and computer vision. Software implementations across popular deep learning frameworks (PyTorch, TensorFlow, MXNet) incorporate these methods for general use within the deep learning community. Finally, this article aids the practitioner in understanding the matrix backpropagation methodology as part of larger computational graphs and, hopefully, leads to new lines of research.
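To illustrate the setting, the following is a minimal sketch (not the article's own implementation) of how a differentiable QR decomposition can solve a least squares problem inside a larger computational graph, assuming PyTorch with `torch.linalg.qr` and `torch.linalg.solve_triangular` available; the names $A$, $b$, and the loss are illustrative only.

```python
import torch

m, n = 8, 3                              # deep case: m > n
A = torch.randn(m, n, requires_grad=True)
b = torch.randn(m)

# Reduced QR: A = QR with Q (m x n, orthonormal columns) and R (n x n, upper triangular).
Q, R = torch.linalg.qr(A, mode="reduced")

# Solve the least squares problem min_x ||Ax - b|| via the triangular system R x = Q^T b.
x = torch.linalg.solve_triangular(R, (Q.T @ b).unsqueeze(-1), upper=True).squeeze(-1)

# Any scalar loss built from x backpropagates through the QR factorization to A.
loss = x.pow(2).sum()
loss.backward()
print(A.grad.shape)                      # gradients with respect to the input matrix
```

The triangular solve avoids forming the normal equations $A^\top A x = A^\top b$, which is the source of the numerical stability advantage mentioned above.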