For fast reconstruction of large tomographic datasets, filtered backprojection-type or Fourier-based algorithms are still the method of choice, as they have been for decades. These robust and computationally efficient algorithms have been integrated into a broad range of software packages. Although the underlying mathematical formulas used for image reconstruction are unambiguous, variations in discretisation and interpolation result in quantitative differences between reconstructed images obtained from different software. This hinders reproducibility of experimental results. In this paper, we propose a way to reduce such differences by optimising the filter used in analytical algorithms. These filters can be computed using a wrapper routine around a black-box implementation of a reconstruction algorithm, and lead to quantitatively similar reconstructions. We demonstrate use cases for our approach by computing implementation-adapted filters for several open-source implementations and applying them to simulated phantoms and real-world data acquired at a synchrotron. Our contribution to a reproducible reconstruction step forms a building block towards a fully reproducible synchrotron tomography data processing pipeline.
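To illustrate the idea of computing a filter through a wrapper around a black-box reconstruction routine, the following is a minimal sketch, not the paper's actual implementation. It assumes two hypothetical user-supplied callables, `reconstruct(sinogram, h)` (a filtered backprojection wrapper that accepts a custom filter `h`) and `forward_project(image)` (the matching projector), and exploits the fact that the reconstruction is linear in the filter coefficients to fit them by least squares.

```python
# A minimal sketch of an implementation-adapted filter computed around a
# black-box reconstruction routine. The callables `reconstruct` and
# `forward_project` are assumptions: in practice they would wrap a specific
# open-source package's FBP implementation and a matching forward projector.
import numpy as np


def implementation_adapted_filter(reconstruct, forward_project, sinogram, n_filter):
    """Fit filter coefficients h so that reconstruct(sinogram, h), when
    re-projected, best reproduces the measured sinogram in a least-squares
    sense. Relies on the reconstruction being linear in h."""
    basis_columns = []
    for i in range(n_filter):
        e_i = np.zeros(n_filter)
        e_i[i] = 1.0
        # Reconstruction with the i-th unit filter; by linearity, the
        # reconstruction for any filter h is the corresponding combination
        # of these basis reconstructions.
        x_i = reconstruct(sinogram, e_i)
        basis_columns.append(forward_project(x_i).ravel())
    A = np.stack(basis_columns, axis=1)              # shape: (n_data, n_filter)
    h, *_ = np.linalg.lstsq(A, sinogram.ravel(), rcond=None)
    return h


# Hypothetical usage with wrappers around an existing toolbox:
#   h = implementation_adapted_filter(my_fbp, my_projector, sino, n_filter)
#   rec = my_fbp(sino, h)
```

In practice the filter would typically be parameterised on a coarser basis and the fit regularised; the sketch only conveys how a wrapper can adapt the filter to whatever discretisation and interpolation choices the black-box implementation makes.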