Hypothesis testing and the use of expert knowledge, or causal priors, have not been well explored in the context of generative models. We propose a novel set of generative architectures, Causal Gen and Causal Variational Gen, that can utilize nonparametric structural causal knowledge combined with a deep learning functional approximation. We show how, using a deliberate (non-random) split of training and testing data, these models generalize better to similar but out-of-distribution data points than non-causal generative models and prediction models such as variational autoencoders and fully connected neural networks. We explore using this generalization error as a proxy for causal model hypothesis testing. We further show how dropout can be used to learn functional relationships of structural models that are difficult to learn with traditional methods. We validate our methods on a synthetic pendulum dataset as well as a trauma surgery ground-level fall dataset.
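For concreteness, the sketch below illustrates one way nonparametric structural knowledge could be combined with learned functional approximators: each variable in a known causal graph is generated by its own small network that takes only that variable's parents (plus exogenous noise) as input, with dropout applied inside each learned structural equation. This is a minimal illustration under assumed pendulum-style variable names (angle, light_pos, shadow_len, shadow_pos) and hypothetical class names (StructuralEquation, CausalGenSketch); it is not the paper's implementation of Causal Gen or Causal Variational Gen.

import torch
import torch.nn as nn

class StructuralEquation(nn.Module):
    # One learned structural equation: child = f(parents, noise).
    def __init__(self, n_parents, hidden=32, p_drop=0.1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_parents + 1, hidden),  # +1 input for exogenous noise
            nn.ReLU(),
            nn.Dropout(p_drop),                # dropout inside the learned mechanism
            nn.Linear(hidden, 1),
        )

    def forward(self, parents):
        noise = torch.randn(parents.shape[0], 1, device=parents.device)
        return self.net(torch.cat([parents, noise], dim=-1))

class CausalGenSketch(nn.Module):
    # Wiring encodes the assumed graph: (angle, light_pos) -> shadow_len, shadow_pos.
    def __init__(self):
        super().__init__()
        self.f_shadow_len = StructuralEquation(n_parents=2)
        self.f_shadow_pos = StructuralEquation(n_parents=2)

    def forward(self, angle, light_pos):
        parents = torch.stack([angle, light_pos], dim=-1)  # shape (batch, 2)
        return self.f_shadow_len(parents), self.f_shadow_pos(parents)

# Usage: sample effects from causes for a batch of 4 pendulum configurations.
model = CausalGenSketch()
angle = torch.rand(4)      # hypothetical cause 1
light_pos = torch.rand(4)  # hypothetical cause 2
shadow_len, shadow_pos = model(angle, light_pos)

Because the graph is fixed and only the mechanisms are learned, a deliberately chosen train/test split over the causes then probes how well the assumed structure extrapolates, which is the generalization-error signal the abstract proposes as a proxy for hypothesis testing.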