A Bayesian Network is a directed acyclic graph (DAG) on a set of $n$ random variables (the vertices); a Bayesian Network Distribution (BND) is a probability distribution on the random variables that is Markovian on the graph. A finite $k$-mixture of such models is graphically represented by a larger graph which has an additional "hidden" (or "latent") random variable $U$, ranging in $\{1,\ldots,k\}$, and a directed edge from $U$ to every other vertex. Models of this type are fundamental to causal inference, where $U$ models an unobserved confounding effect of multiple populations, obscuring the causal relationships in the observable DAG. By solving the mixture problem and recovering the joint probability distribution with $U$, traditionally unidentifiable causal relationships become identifiable. Using a reduction to the better-studied "product" case on empty graphs, we give the first algorithm for learning mixtures of BNDs on non-empty graphs.
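For reference, the mixture model described above factorizes in the standard Markovian way (this display is a restatement of the definitions, not part of the stated result; the notation $\mathrm{pa}(i)$ for the parent set of $X_i$ is introduced here for illustration):
$$\Pr[X_1=x_1,\ldots,X_n=x_n] \;=\; \sum_{u=1}^{k} \Pr[U=u]\,\prod_{i=1}^{n} \Pr\!\left[X_i=x_i \,\middle|\, X_{\mathrm{pa}(i)}=x_{\mathrm{pa}(i)},\, U=u\right].$$
In the "product" case of an empty graph, each $\mathrm{pa}(i)$ is empty, so the inner factors reduce to $\Pr[X_i=x_i \mid U=u]$ and the model becomes a mixture of product distributions.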