Sparse-view CT reconstruction is important in a wide range of applications due to limitations on cost, acquisition time, or dosage. However, traditional direct reconstruction methods such as filtered back-projection (FBP) lead to low-quality reconstructions in the sub-Nyquist regime. In contrast, deep neural networks (DNNs) can produce high-quality reconstructions from sparse and noisy data, e.g., through post-processing of FBP reconstructions, as can model-based iterative reconstruction (MBIR), albeit at a higher computational cost. In this paper, we introduce a direct-reconstruction DNN method called Recurrent Stacked Back Projection (RSBP) that uses sequentially acquired backprojections of individual views as input to a recurrent convolutional LSTM network. The SBP structure maintains all information in the sinogram, while the recurrent processing exploits the correlations between adjacent views and produces an updated reconstruction after each new view. We train our network on simulated data, test on both simulated and real data, and demonstrate that RSBP outperforms both DNN post-processing of FBP images and basic MBIR, with a lower computational cost than MBIR.
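The following is a minimal sketch (not the authors' released code) of the recurrent idea described above: single-view backprojection images are fed one at a time into a convolutional LSTM cell, and a small readout head emits an updated reconstruction after each new view. The layer sizes, channel counts, and the source of the per-view backprojections are all assumptions made for illustration.

```python
# Hypothetical sketch of recurrent processing of per-view backprojections.
# Layer sizes and the readout head are assumptions, not the paper's architecture.
import torch
import torch.nn as nn


class ConvLSTMCell(nn.Module):
    """Convolutional LSTM cell: all gates are computed with 2-D convolutions."""

    def __init__(self, in_ch, hid_ch, ksize=3):
        super().__init__()
        self.hid_ch = hid_ch
        self.gates = nn.Conv2d(in_ch + hid_ch, 4 * hid_ch, ksize, padding=ksize // 2)

    def forward(self, x, state):
        h, c = state
        i, f, o, g = torch.chunk(self.gates(torch.cat([x, h], dim=1)), 4, dim=1)
        i, f, o, g = torch.sigmoid(i), torch.sigmoid(f), torch.sigmoid(o), torch.tanh(g)
        c = f * c + i * g          # update cell state
        h = o * torch.tanh(c)      # update hidden state
        return h, c


class RSBPSketch(nn.Module):
    """Illustrative recurrent network over sequentially acquired backprojections."""

    def __init__(self, hid_ch=32):
        super().__init__()
        self.cell = ConvLSTMCell(in_ch=1, hid_ch=hid_ch)
        self.readout = nn.Conv2d(hid_ch, 1, kernel_size=1)  # image estimate per step

    def forward(self, view_backprojections):
        # view_backprojections: (batch, n_views, 1, H, W), one backprojected view per step
        b, n, _, hgt, wid = view_backprojections.shape
        h = view_backprojections.new_zeros(b, self.cell.hid_ch, hgt, wid)
        c = torch.zeros_like(h)
        recons = []
        for t in range(n):  # sequential, view-by-view processing
            h, c = self.cell(view_backprojections[:, t], (h, c))
            recons.append(self.readout(h))  # updated reconstruction after each new view
        return torch.stack(recons, dim=1)


if __name__ == "__main__":
    model = RSBPSketch()
    dummy = torch.randn(2, 8, 1, 64, 64)  # 8 simulated single-view backprojections
    print(model(dummy).shape)             # torch.Size([2, 8, 1, 64, 64])
```

In practice the per-view backprojection images would come from applying the adjoint of the CT forward model to each acquired view; how those inputs are normalized and how the final reconstruction is supervised are details not specified in the abstract.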