Accelerating the magnetic resonance imaging (MRI) reconstruction process is a challenging ill-posed inverse problem due to the excessive under-sampling operation in k-space. In this paper, we propose a recurrent transformer model, namely ReconFormer, for MRI reconstruction, which can iteratively reconstruct high-fidelity magnetic resonance images from highly under-sampled k-space data. In particular, the proposed architecture is built upon Recurrent Pyramid Transformer Layers (RPTL), which jointly exploit intrinsic multi-scale information at every architecture unit as well as dependencies of deep feature correlations through recurrent states. Moreover, the proposed ReconFormer is lightweight, since it employs a recurrent structure for parameter efficiency. We validate the effectiveness of ReconFormer on multiple datasets with different magnetic resonance sequences and show that it achieves significant improvements over state-of-the-art methods with better parameter efficiency. Implementation code will be available at https://github.com/guopengf/ReconFormer.
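To make the iterative recurrent reconstruction idea concrete, the following is a minimal NumPy sketch, not the authors' implementation: a toy smoothing step stands in for the learned RPTL update, a hidden array stands in for the recurrent deep-feature state, and a k-space data-consistency step re-injects the measured samples after each refinement. All function names here (`data_consistency`, `recurrent_reconstruct`, `toy_refine`) are illustrative and do not appear in the paper.

```python
import numpy as np

def data_consistency(x, k_meas, mask):
    """Overwrite predicted k-space values with measured ones at sampled locations."""
    k_pred = np.fft.fft2(x)
    k_dc = np.where(mask, k_meas, k_pred)
    return np.fft.ifft2(k_dc).real

def recurrent_reconstruct(k_meas, mask, refine, n_iters=5):
    """Iterative recurrent reconstruction: start from the zero-filled image,
    alternate a refinement step (carrying a recurrent state) with data consistency."""
    x = np.fft.ifft2(k_meas).real      # zero-filled initial reconstruction
    h = np.zeros_like(x)               # recurrent state (stand-in for deep features)
    for _ in range(n_iters):
        x, h = refine(x, h)            # learned update in the real model
        x = data_consistency(x, k_meas, mask)
    return x

def toy_refine(x, h):
    """Placeholder refiner: blends the image with a local average; the paper's
    RPTL would instead be a multi-scale transformer operating on deep features."""
    h = 0.5 * h + 0.5 * x
    x_smooth = (np.roll(x, 1, 0) + np.roll(x, -1, 0) +
                np.roll(x, 1, 1) + np.roll(x, -1, 1) + x) / 5.0
    return 0.8 * x + 0.2 * x_smooth, h

rng = np.random.default_rng(0)
img = rng.random((32, 32))                 # toy "ground-truth" image
mask = rng.random((32, 32)) < 0.4          # ~40% random k-space sampling
k_meas = np.fft.fft2(img) * mask           # under-sampled measurements
recon = recurrent_reconstruct(k_meas, mask, toy_refine)
```

The recurrent structure is what keeps the parameter count low: the same refinement module (and its weights) is reused at every iteration, rather than stacking distinct modules per stage.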