Unrolled neural networks have recently achieved state-of-the-art accelerated MRI reconstruction. These networks unroll iterative optimization algorithms by alternating between physics-based data-consistency steps and neural-network-based regularization. However, they require several iterations of a large neural network to handle high-dimensional imaging tasks such as 3D MRI. This limits traditional training algorithms based on backpropagation, as computing gradients and storing intermediate activations impose prohibitively large memory and compute requirements. To address this challenge, we propose Greedy LEarning for Accelerated MRI (GLEAM) reconstruction, an efficient training strategy for high-dimensional imaging settings. GLEAM splits the end-to-end network into decoupled network modules. Each module is optimized in a greedy manner with decoupled gradient updates, reducing the memory footprint during training. We show that the decoupled gradient updates can be performed in parallel on multiple graphics processing units (GPUs) to further reduce training time. We present experiments with 2D and 3D datasets, including multi-coil knee, brain, and dynamic cardiac cine MRI. We observe that: i) GLEAM generalizes as well as state-of-the-art memory-efficient baselines, such as gradient checkpointing and invertible networks, with the same memory footprint but 1.3x faster training; ii) for the same memory footprint, GLEAM yields a 1.1 dB PSNR gain in 2D and a 1.8 dB gain in 3D over end-to-end baselines.
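The greedy decoupling can be illustrated with a short sketch. Below is a minimal PyTorch toy example, assuming an unrolled network of repeated regularizer-plus-data-consistency modules; the module structure, local MSE loss, and soft data-consistency step are illustrative stand-ins, not the paper's released implementation. Each module gets its own optimizer and a local loss, and its output is detached before being passed on, so no end-to-end computation graph is built and only one module's activations reside in memory at a time.

```python
# Minimal sketch of GLEAM-style greedy training. All names here are
# illustrative assumptions: a real unrolled MRI network would enforce
# data consistency in k-space with the sampling mask and coil operators.
import torch
import torch.nn as nn

class UnrolledModule(nn.Module):
    """One unrolling step: CNN regularizer + toy soft data-consistency step."""
    def __init__(self, channels: int = 2):
        super().__init__()
        self.regularizer = nn.Sequential(
            nn.Conv2d(channels, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, channels, 3, padding=1),
        )
        self.dc_weight = nn.Parameter(torch.tensor(0.5))  # learned DC weight

    def forward(self, x, zero_filled):
        x = x - self.regularizer(x)  # learned denoising / proximal update
        # Toy image-domain stand-in for a k-space data-consistency step.
        return (1 - self.dc_weight) * x + self.dc_weight * zero_filled

num_unrolls = 4
modules = [UnrolledModule() for _ in range(num_unrolls)]
optims = [torch.optim.Adam(m.parameters(), lr=1e-4) for m in modules]
criterion = nn.MSELoss()

def gleam_step(zero_filled, target):
    """One greedy training step: each module is updated with its own local
    loss, and its output is detached so no end-to-end graph is ever built."""
    x = zero_filled
    for module, opt in zip(modules, optims):
        opt.zero_grad()
        out = module(x, zero_filled)
        loss = criterion(out, target)  # local (greedy) objective per module
        loss.backward()                # gradient flows through this module only
        opt.step()
        x = out.detach()               # decouple: free this module's graph
    return x

# Toy usage: complex images stored as 2-channel real tensors (batch, 2, H, W).
zero_filled = torch.randn(8, 2, 64, 64)  # undersampled (zero-filled) input
target = torch.randn(8, 2, 64, 64)       # fully-sampled reference
recon = gleam_step(zero_filled, target)
```

Because each local update depends only on the detached output of the preceding module, the per-module updates can also be placed on separate GPUs and pipelined, which is the source of the parallel training speedup described above.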