Fluorescence lifetime imaging (FLI) has been receiving increased attention in recent years as a powerful diagnostic technique in biological and medical research. However, existing FLI systems often suffer from a trade-off between processing speed, accuracy, and robustness. In this paper, we propose a robust approach that enables fast FLI with no degradation of accuracy. The approach is based on a single-photon avalanche diode (SPAD) time-correlated single-photon counting (TCSPC) system coupled to a recurrent neural network (RNN) that accurately estimates the fluorescence lifetime directly from raw timestamps without building histograms, thereby drastically reducing transfer data volumes and hardware resource utilization, and thus enabling FLI acquisition at video rate. We train two variants of the RNN on a synthetic dataset and compare the results to those obtained with the center-of-mass method (CMM) and least-squares (LS) fitting. Results demonstrate that the two RNN variants, the gated recurrent unit (GRU) and the long short-term memory (LSTM), are comparable to CMM and LS fitting in terms of accuracy, while outperforming them by a large margin in the presence of background noise. To explore the ultimate limits of the approach, we derive the Cramér-Rao lower bound of the measurement, showing that the RNN yields lifetime estimates with near-optimal precision. Moreover, our FLI model, trained purely on synthetic data, generalizes well to previously unseen real-world data. To demonstrate real-time operation, we built an FLI microscope based on Piccolo, a 32×32 SPAD sensor developed in our lab. Four quantized GRU cores, capable of processing up to 4 million photons per second, are deployed on a Xilinx Kintex-7 FPGA. Powered by the GRU, the FLI setup retrieves fluorescence lifetime images in real time at up to 10 frames per second. The proposed FLI system is promising and well suited for practical biomedical applications.
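The sketch below illustrates, in PyTorch, the core idea described above: a GRU that consumes raw TCSPC photon timestamps one at a time and regresses a fluorescence lifetime, bypassing histogram construction. The class name, layer sizes, and normalization are illustrative assumptions, not the paper's exact architecture or training setup.

```python
# Minimal sketch of histogram-free lifetime regression from raw timestamps.
# Architecture details (hidden size, single linear head) are assumptions.
import torch
import torch.nn as nn


class LifetimeGRU(nn.Module):
    def __init__(self, hidden_size: int = 32):
        super().__init__()
        # Each photon contributes a single scalar timestamp (length-1 feature).
        self.gru = nn.GRU(input_size=1, hidden_size=hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)  # regress lifetime (e.g. in ns)

    def forward(self, timestamps: torch.Tensor) -> torch.Tensor:
        # timestamps: (batch, n_photons, 1), e.g. normalized to the TDC range.
        _, h_n = self.gru(timestamps)           # final hidden state summarizes the photon stream
        return self.head(h_n[-1]).squeeze(-1)   # (batch,) lifetime estimates


# Usage example: a batch of synthetic mono-exponential decays.
if __name__ == "__main__":
    true_tau = torch.tensor([2.0, 3.5])  # ground-truth lifetimes in ns
    photons = torch.distributions.Exponential(1.0 / true_tau).sample((256,))
    photons = photons.T.unsqueeze(-1)    # (batch=2, 256 photons, 1)
    model = LifetimeGRU()
    print(model(photons))  # untrained output; training on synthetic decays is required
```

In practice the paper trains such a network on synthetic data only, which is what makes the reported generalization to real-world measurements notable.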