Let $\sigma$ be a first-order signature and let $\mathbf{W}_n$ be the set of all $\sigma$-structures with domain $[n] = \{1, \ldots, n\}$. By an inference framework we mean a class $\mathbf{F}$ of pairs $(\mathbb{P}, L)$, where $\mathbb{P} = (\mathbb{P}_n : n = 1, 2, 3, \ldots)$, each $\mathbb{P}_n$ is a probability distribution on $\mathbf{W}_n$, and $L$ is a logic with truth values in the unit interval $[0, 1]$. The inference frameworks we consider contain pairs $(\mathbb{P}, L)$ where $\mathbb{P}$ is determined by a probabilistic graphical model and $L$ expresses statements about, for example, (conditional) probabilities or (arithmetic or geometric) averages. We define asymptotic expressivity of inference frameworks: $\mathbf{F}'$ is asymptotically at least as expressive as $\mathbf{F}$ if for every $(\mathbb{P}, L) \in \mathbf{F}$ there is $(\mathbb{P}', L') \in \mathbf{F}'$ such that $\mathbb{P}$ is asymptotically total-variation-equivalent to $\mathbb{P}'$ and for every $\varphi(\bar{x}) \in L$ there is $\varphi'(\bar{x}) \in L'$ such that $\varphi'(\bar{x})$ is asymptotically equivalent to $\varphi(\bar{x})$ with respect to $\mathbb{P}$. This relation is a preorder, and we describe a (strict) partial order on the equivalence classes of some inference frameworks that seem natural in the context of machine learning and artificial intelligence. Our analysis includes Conditional Probability Logic ($CPL$) and Probability Logic with Aggregation functions ($PLA$), introduced in earlier work. We also define a sublogic $coPLA$ of $PLA$ in which the aggregation functions satisfy additional continuity constraints, and show that $coPLA$ gives rise to asymptotically strictly less expressive inference frameworks than $PLA$.
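In symbols, the expressivity relation may be displayed as follows; this is only a compact restatement of the definition above, where the notations $\preceq$, $\sim_{\mathrm{TV}}$ (asymptotic total-variation equivalence), and $\sim_{\mathbb{P}}$ (asymptotic equivalence of formulas with respect to $\mathbb{P}$) are introduced here for illustration:
\[
\mathbf{F} \preceq \mathbf{F}' \iff
\forall (\mathbb{P}, L) \in \mathbf{F} \;
\exists (\mathbb{P}', L') \in \mathbf{F}' : \;
\mathbb{P} \sim_{\mathrm{TV}} \mathbb{P}' \ \text{ and } \
\forall \varphi(\bar{x}) \in L \;
\exists \varphi'(\bar{x}) \in L' : \;
\varphi'(\bar{x}) \sim_{\mathbb{P}} \varphi(\bar{x}).
\]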