We propose a novel sensitivity analysis framework for linear estimands when identification failure can be viewed as observing the wrong distribution of outcomes. Our family of assumptions bounds the density ratio between the observed and the true conditional outcome distribution. The framework links naturally to selection models, generalizes existing assumptions for the Regression Discontinuity (RD) and Inverse Propensity Weighting (IPW) estimands, and provides a novel nonparametric perspective on violations of the identification assumptions for ordinary least squares (OLS). Our sharp partial identification results extend existing results for IPW to cover other estimands and to assumptions that allow even unbounded likelihood ratios, yielding a simple, unified characterization of bounds under assumptions such as the c-dependence of Masten and Poirier (2018). The sharp bounds can be written as a simple closed-form moment of the data, the nuisance functions estimated in the primary analysis, and the conditional outcome quantile function. In simulations, our method performs well even when targeting a discontinuous and nearly infinite bound.
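As a minimal illustrative sketch of the density-ratio assumption described above (the notation $\Lambda$, $f_{\mathrm{obs}}$, $f^{*}$ is ours, not taken from the paper), a natural two-sided bound in the spirit of marginal sensitivity models would read
\[
\Lambda^{-1} \;\le\; \frac{f_{\mathrm{obs}}(y \mid x)}{f^{*}(y \mid x)} \;\le\; \Lambda, \qquad \Lambda \ge 1,
\]
where $f_{\mathrm{obs}}$ denotes the conditional outcome density identified from the data, $f^{*}$ the true conditional outcome density of interest, and $\Lambda$ indexes the severity of the identification failure, with $\Lambda = 1$ recovering point identification and $\Lambda \to \infty$ corresponding to the unbounded-likelihood-ratio regime the abstract mentions.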