A Bayesian Network is a directed acyclic graph (DAG) on a set of $n$ random variables (the vertices); a Bayesian Network Distribution (BND) is a probability distribution on the random variables that is Markovian on the graph. A finite $k$-mixture of such models is graphically represented by a larger graph which has an additional ``hidden'' (or ``latent'') random variable $U$, ranging in $\{1,\ldots,k\}$, and a directed edge from $U$ to every other vertex. Models of this type are fundamental to causal inference, where $U$ models an unobserved confounding effect of multiple populations, obscuring the causal relationships in the observable DAG. By solving the mixture problem and recovering the joint probability distribution on $U$, one can identify causal relationships that are traditionally unidentifiable. Using a reduction to the better-studied ``product'' case on empty graphs, we give the first algorithm to learn mixtures of non-empty DAGs.
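For concreteness, the observable distribution of such a $k$-mixture factors as follows (the parent-set notation $\mathrm{pa}(i)$ is introduced here only for illustration and is not part of the statement above):
\[
\Pr[X_1=x_1,\ldots,X_n=x_n] \;=\; \sum_{u=1}^{k} \Pr[U=u]\, \prod_{i=1}^{n} \Pr\!\bigl[X_i=x_i \,\bigm|\, X_{\mathrm{pa}(i)}=x_{\mathrm{pa}(i)},\, U=u\bigr],
\]
so that, conditioned on $U=u$, each component is itself a BND that is Markovian on the original DAG.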