Time-lapse fluorescent microscopy (TLFM) combined with predictive mathematical modelling is a powerful tool to study the inherently dynamic processes of life at the single-cell level. Such experiments are costly, complex and labour-intensive. A complementary approach, and a step towards in silico experimentation, is to synthesise the imagery itself. Here, we propose Multi-StyleGAN as a descriptive approach to simulate time-lapse fluorescence microscopy imagery of living cells, based on a past experiment. This novel generative adversarial network synthesises a multi-domain sequence of consecutive timesteps. We showcase Multi-StyleGAN on imagery of multiple live yeast cells in microstructured environments and train on a dataset recorded in our laboratory. The simulation captures underlying biophysical factors and time dependencies, such as cell morphology, growth, and physical interactions, as well as the intensity of a fluorescent reporter protein. An immediate application is to generate additional training and validation data for feature extraction algorithms or to aid and expedite the development of advanced experimental techniques such as online monitoring or control of cells. Code and dataset are available at https://git.rwth-aachen.de/bcs/projects/tp/multi-stylegan.
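To make the notion of a "multi-domain sequence of consecutive timesteps" concrete, the sketch below shows a toy PyTorch generator that maps a latent code to a stack of image frames covering several domains (e.g. brightfield and a fluorescence channel) over a few consecutive timesteps. This is a minimal illustration of the output structure only; the class name, layer choices, tensor shapes and default sizes are assumptions for exposition and do not reflect the released Multi-StyleGAN implementation.

```python
import torch
import torch.nn as nn


class ToyMultiDomainGenerator(nn.Module):
    """Hypothetical stand-in for a multi-domain sequence generator.

    Maps a latent vector to a short image sequence with one channel per
    imaging domain (e.g. brightfield and fluorescence) per timestep.
    """

    def __init__(self, latent_dim=512, timesteps=3, domains=2, resolution=64):
        super().__init__()
        self.timesteps = timesteps
        self.domains = domains
        self.resolution = resolution
        self.net = nn.Sequential(
            nn.Linear(latent_dim, 1024),
            nn.ReLU(),
            nn.Linear(1024, timesteps * domains * resolution * resolution),
            nn.Tanh(),  # pixel values in [-1, 1]
        )

    def forward(self, z):
        x = self.net(z)
        # One sequence per sample: [batch, timesteps, domains, height, width]
        return x.view(-1, self.timesteps, self.domains,
                      self.resolution, self.resolution)


if __name__ == "__main__":
    torch.manual_seed(0)
    generator = ToyMultiDomainGenerator()
    z = torch.randn(4, 512)      # batch of latent codes
    sequences = generator(z)
    print(sequences.shape)       # torch.Size([4, 3, 2, 64, 64])
```

In such a layout, downstream feature extraction or segmentation algorithms can be trained on the synthetic sequences exactly as they would be on recorded TLFM data, since each sample carries both imaging domains for every timestep.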