Causal inference aims to estimate the causal effect when an intervention is applied in a causal relationship. Specifically, in a causal model with a binary intervention, i.e., control and treatment, the causal effect is simply the difference between the factual and counterfactual outcomes. The difficulty is that the counterfactual can never be observed and must be estimated, so the causal effect itself can only be estimated. The key challenge in estimating the counterfactual is to identify confounders, which affect both the outcomes and the treatment. A typical approach is to formulate causal inference as a supervised learning problem so that the counterfactual can be predicted. Recent machine learning methods, including linear regression and deep learning models, have been adapted to causal inference. In this paper, we propose a method to estimate Causal Effect by using Variational Information Bottleneck (CEVIB). The promising point is that VIB is able to naturally distill confounding variables from the data, which enables estimating the causal effect from observational data. We compared CEVIB with other methods on three data sets, showing that our approach achieved the best performance. We also experimentally demonstrated the robustness of our method.