We formulate and analyze the compound information bottleneck program. In this problem, a Markov chain $ \mathsf{X} \rightarrow \mathsf{Y} \rightarrow \mathsf{Z} $ is assumed with fixed marginal distributions $\mathsf{P}_{\mathsf{X}}$ and $\mathsf{P}_{\mathsf{Y}}$, and the mutual information between $ \mathsf{X} $ and $ \mathsf{Z} $ is maximized over the choice of the conditional probability of $\mathsf{Z}$ given $\mathsf{Y}$ from a given class, under the \textit{worst choice} of the joint probability of the pair $(\mathsf{X},\mathsf{Y})$ from a different class. We consider several classes based on extremes of: mutual information; minimal correlation; total variation; and relative entropy. We provide values, bounds, and various characterizations for specific instances of this problem: the binary symmetric case, the scalar Gaussian case, the vector Gaussian case, and the symmetric modulo-additive case. Finally, for the general case, we propose a Blahut--Arimoto-type alternating-iterations algorithm to find a consistent solution to this problem.
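For orientation, the classical (non-compound) information bottleneck already admits Blahut--Arimoto-style self-consistent iterations, which the proposed algorithm generalizes. The sketch below is a minimal illustration of those classical iterations only, not of the compound program itself; the function name, the trade-off parameter $\beta$, and the numerical guards are assumptions introduced for the example.

```python
import numpy as np

def ib_blahut_arimoto(p_xy, n_z, beta=5.0, n_iter=200, seed=0):
    """Classical IB self-consistent iterations: given a joint p(x, y),
    alternate the standard updates for the encoder p(z|y), the marginal
    p(z), and the decoder p(x|z).  Returns p(z|y) of shape (n_z, n_y)."""
    rng = np.random.default_rng(seed)
    n_x, n_y = p_xy.shape
    p_y = p_xy.sum(axis=0)                      # marginal p(y)
    p_x_given_y = p_xy / p_y                    # columns are p(x|y)
    # random stochastic initialization of the encoder p(z|y)
    p_z_given_y = rng.random((n_z, n_y))
    p_z_given_y /= p_z_given_y.sum(axis=0)
    for _ in range(n_iter):
        p_z = p_z_given_y @ p_y                 # p(z)
        # decoder p(x|z) via the Markov chain X -> Y -> Z:
        # p(x, z) = sum_y p(x|y) p(z|y) p(y)
        p_xz = p_x_given_y @ (p_z_given_y * p_y).T
        p_x_given_z = p_xz / np.maximum(p_z, 1e-30)
        # KL(p(x|y) || p(x|z)) for every (z, y) pair
        log_ratio = np.log(np.maximum(p_x_given_y, 1e-30))[:, None, :] \
                  - np.log(np.maximum(p_x_given_z, 1e-30))[:, :, None]
        kl = np.einsum('xy,xzy->zy', p_x_given_y, log_ratio)
        # encoder update: p(z|y) proportional to p(z) exp(-beta * KL)
        p_z_given_y = p_z[:, None] * np.exp(-beta * kl)
        p_z_given_y /= p_z_given_y.sum(axis=0)
    return p_z_given_y
```

The compound setting additionally requires an inner step over the adversarial choice of the joint law of $(\mathsf{X},\mathsf{Y})$ from its class, which this classical sketch deliberately omits.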