We consider the problem of reconstructing the signal and the hidden variables from observations coming from a multi-layer network with rotationally invariant weight matrices. The multi-layer structure models inference from deep generative priors, and the rotational invariance imposed on the weights generalizes the i.i.d.\ Gaussian assumption by allowing for a complex correlation structure, which is typical in applications. In this work, we present a new class of approximate message passing (AMP) algorithms and give a state evolution recursion which precisely characterizes their performance in the large system limit. In contrast with the existing multi-layer VAMP (ML-VAMP) approach, our proposed AMP -- dubbed multi-layer rotationally invariant generalized AMP (ML-RI-GAMP) -- provides a natural generalization beyond Gaussian designs, in the sense that it recovers the existing Gaussian AMP as a special case. Furthermore, ML-RI-GAMP exhibits a significantly lower complexity than ML-VAMP, as the computationally intensive singular value decomposition is replaced by an estimation of the moments of the design matrices. Finally, our numerical results show that this complexity gain comes at little to no cost in the performance of the algorithm.
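The abstract states that ML-RI-GAMP replaces the singular value decomposition required by ML-VAMP with an estimation of the moments of the design matrices. As a rough illustration of why moment estimation is cheaper, the following is a minimal sketch, not the paper's implementation, of a standard Hutchinson-type estimator for the spectral moments of $A^\top A$ that uses only matrix-vector products; the function name and parameters are illustrative.

```python
import numpy as np

def estimate_gram_moments(A, num_moments=6, num_probes=20, seed=0):
    """Estimate tr((A^T A)^k) / d for k = 1, ..., num_moments without an SVD.

    Hypothetical helper: uses Rademacher probe vectors z, for which
    E[z^T M^k z] = tr(M^k), applying M = A^T A only through
    matrix-vector products.
    """
    rng = np.random.default_rng(seed)
    n, d = A.shape
    moments = np.zeros(num_moments)
    for _ in range(num_probes):
        z = rng.choice([-1.0, 1.0], size=d)  # Rademacher probe
        v = z.copy()
        for k in range(num_moments):
            v = A.T @ (A @ v)                # v = (A^T A)^{k+1} z, cost O(nd)
            moments[k] += z @ v              # z^T (A^T A)^{k+1} z ~ tr((A^T A)^{k+1})
    return moments / (num_probes * d)        # average over probes, normalize by d

# Example: moments of a Gaussian design, normalized so the spectrum is O(1).
A = np.random.default_rng(1).standard_normal((2000, 500)) / np.sqrt(2000)
print(estimate_gram_moments(A))
```

Each probe costs $O(nd)$ per moment through the two matrix-vector products, whereas a full SVD of an $n \times d$ matrix costs $O(nd\min(n,d))$; this gap is one way to see the source of the complexity gain claimed above, under the assumption that only a few low-order moments are needed.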