Accelerating the magnetic resonance imaging (MRI) reconstruction process is a challenging ill-posed inverse problem due to the aggressive under-sampling operation in k-space. In this paper, we propose a recurrent transformer model, namely \textbf{ReconFormer}, for MRI reconstruction, which can iteratively reconstruct high-fidelity magnetic resonance images from highly under-sampled k-space data. In particular, the proposed architecture is built upon Recurrent Pyramid Transformer Layers (RPTL), which jointly exploit intrinsic multi-scale information at every architecture unit as well as dependencies of deep feature correlations through recurrent states. Moreover, the proposed ReconFormer is lightweight, since its recurrent structure makes it parameter-efficient. We validate the effectiveness of ReconFormer on multiple datasets with different magnetic resonance sequences and show that it achieves significant improvements over state-of-the-art methods with better parameter efficiency. Implementation code will be available at https://github.com/guopengf/ReconFormer.
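To make the inverse problem concrete, the toy NumPy sketch below simulates Cartesian under-sampling of k-space and the data-consistency step that iterative reconstruction models typically interleave with a learned refinement stage. This is only an illustration of the problem setting, not of the ReconFormer architecture itself; the mask shape, sampling ratio, and the smoothing stand-in for the learned network are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
image = rng.standard_normal((32, 32))        # toy ground-truth "MR image"
kspace = np.fft.fft2(image)                  # fully sampled k-space

# Keep 8 of 32 phase-encode lines (~25%): an ill-posed acquisition.
mask = np.zeros((32, 32), dtype=bool)
mask[rng.choice(32, size=8, replace=False), :] = True
kspace_us = np.where(mask, kspace, 0.0)      # under-sampled measurements

zero_filled = np.fft.ifft2(kspace_us)        # naive zero-filled reconstruction

def data_consistency(recon, kspace_us, mask):
    """Re-impose the measured k-space samples on a candidate reconstruction."""
    k = np.fft.fft2(recon)
    k[mask] = kspace_us[mask]
    return np.fft.ifft2(k)

# Crude smoothing as a stand-in for one learned refinement iteration.
smoothed = 0.5 * (zero_filled + np.roll(zero_filled, 1, axis=0))
refined = data_consistency(smoothed, kspace_us, mask)
```

After the data-consistency step, the refined image agrees exactly with the acquired k-space samples at every measured location, which is the fidelity constraint an iterative reconstruction alternates with learned de-aliasing.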