We present generalized balancing weights, named Neural Balancing Weights (NBW), to estimate the causal effects of an arbitrary mixture of discrete and continuous interventions. The weights are obtained by directly estimating the density ratio between the source and balanced distributions through optimizing the variational representation of an $f$-divergence. For this, we select the $\alpha$-divergence because of its favorable optimization properties: its estimator is $\sqrt{N}$-consistent, admits unbiased mini-batch gradients, and mitigates the vanishing-gradient problem. In addition, we provide a method for checking the balance of the distribution changed by the weights. If the balancing is imperfect, the weights can be improved by adding new balancing weights. Our method can be implemented conveniently in any present-day deep-learning library, and the weights can be used with most state-of-the-art supervised learning algorithms.
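As a sketch of the estimator underlying this approach (the notation below is ours; the abstract does not fix a particular parameterization), the variational (Fenchel-dual) representation of an $f$-divergence between the balanced distribution $q$ and the source distribution $p$ is
$$
D_f(q \,\|\, p) \;=\; \sup_{T} \Big\{ \mathbb{E}_{x \sim q}\big[T(x)\big] \;-\; \mathbb{E}_{x \sim p}\big[f^{*}(T(x))\big] \Big\},
$$
where $f^{*}$ is the convex conjugate of $f$ and $T$ ranges over critic functions (here, neural networks). The supremum is attained at $T^{*}(x) = f'\!\big(q(x)/p(x)\big)$, so the density ratio used as the balancing weight can be read off as $w(x) = q(x)/p(x) = (f')^{-1}\big(T^{*}(x)\big)$. For the $\alpha$-divergence, one common choice of generator is $f_{\alpha}(u) = \frac{u^{\alpha} - \alpha u + \alpha - 1}{\alpha(\alpha - 1)}$, though the paper's exact parameterization may differ.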