To train a deep neural network to mimic the outcomes of processing sequences, a version of the Conditional Generative Adversarial Network (CGAN) can be used. Others have observed that a CGAN can improve results even for deterministic sequences, where a given input is associated with a single output. Surprisingly, our CGAN-based tests on deterministic geophysical processing sequences did not yield a real improvement over the use of an $L_p$ loss; we propose here a first theoretical explanation of why. Our analysis proceeds from the non-deterministic case to the deterministic one, and it led us to develop an adversarial way of training a content loss that gave better results on our data.
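For concreteness, a standard way of combining a conditional adversarial term with an $L_p$ content term (in the spirit of pix2pix-style conditional training; the notation below is ours and is not taken from this work) is
\[
\begin{aligned}
\mathcal{L}_{\mathrm{CGAN}}(G,D) &= \mathbb{E}_{x,y}\!\left[\log D(x,y)\right] + \mathbb{E}_{x}\!\left[\log\bigl(1 - D(x, G(x))\bigr)\right],\\
\mathcal{L}_{L_p}(G) &= \mathbb{E}_{x,y}\!\left[\lVert y - G(x)\rVert_p\right],\\
G^{*} &= \arg\min_{G}\,\max_{D}\;\mathcal{L}_{\mathrm{CGAN}}(G,D) + \lambda\,\mathcal{L}_{L_p}(G),
\end{aligned}
\]
where $x$ is the input of the processing sequence, $y$ the target output, $G$ the generator, $D$ the conditional discriminator, and $\lambda$ weights the content term. In the deterministic case considered here, the conditional distribution $p(y \mid x)$ collapses to a single output per input, which is the setting in which the adversarial term brought no real gain over the $L_p$ loss alone on our data.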