We present generalized balancing weights, Neural Balancing Weights (NBW), to estimate the causal effects of an arbitrary mixture of discrete and continuous interventions. The weights are obtained by directly estimating the density ratio between the source and balanced distributions through optimizing the variational representation of the $f$-divergence. For this, we select the $\alpha$-divergence because of its favorable optimization properties: its estimator has a sample complexity independent of the ground-truth divergence value, it admits unbiased mini-batch gradients, and it mitigates the vanishing-gradient problem. In addition, we provide a method for checking the balance of the distribution induced by the weights. If the balancing is imperfect, the weights can be improved by composing them with additional balancing weights. Our method can be conveniently implemented with any modern deep-learning library, and the resulting weights can be used in most state-of-the-art supervised learning algorithms. The code for our method is available online.
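As a rough sketch of the density-ratio step (not the authors' released implementation), the variational representation $D_f(p \,\|\, q) \ge \mathbb{E}_{p}[T(x)] - \mathbb{E}_{q}[f^{*}(T(x))]$ can be optimized with the substitution $T = f'(r_\theta)$, using the identity $f^{*}(f'(u)) = u f'(u) - f(u)$, so that the learned $r_\theta$ directly approximates the ratio of the balanced to the source density. The PyTorch code below illustrates this for the $\alpha$-divergence generator $f_\alpha(u) = (u^\alpha - 1)/(\alpha(\alpha - 1))$; the toy samplers, network architecture, and the choice $\alpha = 0.5$ are illustrative assumptions.

```python
# Minimal sketch of variational density-ratio estimation under
# alpha-divergence. The data samplers, architecture, and alpha value
# are hypothetical placeholders, not the paper's released code.
import torch
import torch.nn as nn

alpha = 0.5                                                  # placeholder; any alpha outside {0, 1}
f      = lambda u: (u**alpha - 1) / (alpha * (alpha - 1))    # alpha-divergence generator f(u)
f_grad = lambda u: u**(alpha - 1) / (alpha - 1)              # its derivative f'(u)

# r_theta(x) > 0 approximates the density ratio p_balanced / p_source.
ratio_net = nn.Sequential(
    nn.Linear(8, 64), nn.ReLU(),
    nn.Linear(64, 1), nn.Softplus(),
)
opt = torch.optim.Adam(ratio_net.parameters(), lr=1e-3)

for step in range(2000):
    # Toy stand-ins: in practice these are mini-batches from the source
    # data and from the balanced distribution (e.g., treatments shuffled
    # independently of covariates).
    x_src = torch.randn(256, 8)
    x_bal = torch.randn(256, 8) + 0.5

    r_src = ratio_net(x_src) + 1e-6   # small offset for numerical safety
    r_bal = ratio_net(x_bal) + 1e-6
    # Variational lower bound on D_alpha(p_bal || p_src) with T = f'(r):
    #   E_bal[f'(r)] - E_src[r f'(r) - f(r)]
    bound = f_grad(r_bal).mean() - (r_src * f_grad(r_src) - f(r_src)).mean()
    loss = -bound                     # maximize the bound
    opt.zero_grad()
    loss.backward()
    opt.step()

# After training, ratio_net(x) gives balancing weights w(x) ~ p_bal(x)/p_src(x).
```

Because the objective is linear in the two expectations, its mini-batch gradients are unbiased, which is one of the optimization properties the abstract attributes to the $\alpha$-divergence estimator.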