Learning to translate images from a source to a target domain, with applications such as converting simple line drawings into oil paintings, has attracted significant attention. The quality of the translated images hinges on two crucial issues. First, the output distribution must be consistent with that of the target domain. Second, the generated output should be highly correlated with the input. Conditional Generative Adversarial Networks (cGANs) are the most common models for translating images, but a cGAN's performance drops when the training dataset is limited. In this work, we improve the ability of Pix2Pix (a form of cGAN) to model the target distribution with the help of dynamic neural network theory. Our model has two learning cycles. In the first cycle, the model learns the correlation between the input and the ground truth. In the second cycle, the model's architecture is refined so that it learns the target distribution from a noise input. Both cycles are executed in every iteration of the training procedure. Helping the cGAN learn the target distribution from noise yields better generalization at test time and allows the model to fit the target-domain distribution almost perfectly. As a result, our model surpasses Pix2Pix in segmenting images from the HC18 and the Montgomery chest X-ray datasets; both qualitative results and Dice scores show the superiority of our model. Although our proposed method does not use thousands of additional images for pretraining, it achieves in-domain and out-of-domain generalization comparable to state-of-the-art methods.
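The two-cycle idea described above can be sketched in a few lines. The sketch below is a minimal, hypothetical simplification using NumPy stand-ins: `generator` and `discriminator` are toy stubs (the real model uses Pix2Pix's U-Net generator and PatchGAN discriminator), and the second cycle simply reuses the generator weights on a noise input rather than performing the architectural refinement the paper proposes.

```python
import numpy as np

# Hypothetical stand-ins for the Pix2Pix networks (not the paper's actual
# architectures): a linear "generator" and a scalar "discriminator" score.
def generator(x, weights):
    return np.tanh(x @ weights)

def discriminator(y):
    # Probability-like score in (0, 1) that y comes from the target domain.
    return 1.0 / (1.0 + np.exp(-y.mean()))

def two_cycle_losses(image, target, noise, weights):
    """Losses for one training iteration with two learning cycles.

    Cycle 1: translate the input image and penalize deviation from the
    ground truth (enforces correlation between input and output).
    Cycle 2: feed noise instead of an image so the generator also models
    the target distribution directly (assumed simplification of the
    paper's architecture-refinement step).
    """
    # Cycle 1: supervised L1 translation loss, as in Pix2Pix.
    fake = generator(image, weights)
    l1_loss = np.abs(fake - target).mean()

    # Cycle 2: adversarial signal on a sample generated from pure noise.
    fake_from_noise = generator(noise, weights)
    adv_loss = -np.log(discriminator(fake_from_noise) + 1e-8)

    return l1_loss, adv_loss
```

In a full training loop, both losses would be backpropagated through the generator in every iteration, with the discriminator trained adversarially in the usual cGAN fashion.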