We study channel simulation under common randomness assistance in the finite-blocklength regime and identify the smooth channel max-information, which is given by a linear program, as a one-shot converse on the minimal simulation cost for a fixed error tolerance. We show that this one-shot converse can be achieved exactly using no-signaling-assisted codes, and approximately achieved using common randomness-assisted codes. Our one-shot converse thus plays a role analogous to that of the celebrated meta-converse in the complementary problem of channel coding, and we find tight relations between the two bounds. We asymptotically expand our bounds on the simulation cost for discrete memoryless channels, leading to second-order as well as moderate-deviation rate expansions, which can be expressed in terms of the channel capacity and channel dispersion known from noisy channel coding. Our techniques extend to discrete memoryless broadcast channels. In stark contrast to the elusive broadcast channel capacity problem, we show that the reverse problem of broadcast channel simulation under common randomness assistance allows for an efficiently computable single-letter characterization of the asymptotic rate region in terms of the broadcast channel's multipartite mutual information. Finally, we present a Blahut-Arimoto-type algorithm to compute the rate region efficiently.
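For orientation, the following is a minimal sketch of the classical Blahut-Arimoto iteration for the capacity of a discrete memoryless channel, which is the algorithm family the abstract's rate-region computation belongs to; it is not the paper's broadcast-simulation variant, and the function name and parameters are illustrative.

```python
import numpy as np

def blahut_arimoto(W, tol=1e-9, max_iter=10000):
    """Classical Blahut-Arimoto iteration for the capacity of a DMC.

    W: channel matrix of shape (|X|, |Y|); row x is the conditional
       distribution W(y|x) and must sum to 1.
    Returns (capacity estimate in nats, input distribution).
    """
    n_in = W.shape[0]
    p = np.full(n_in, 1.0 / n_in)            # uniform initial input distribution
    lower = 0.0
    for _ in range(max_iter):
        q = p @ W                             # induced output distribution q(y)
        # relative entropy D(W(.|x) || q) for each input symbol x
        with np.errstate(divide="ignore", invalid="ignore"):
            log_ratio = np.where(W > 0, np.log(W / q), 0.0)
        d = np.sum(W * log_ratio, axis=1)
        lower = np.log(np.sum(p * np.exp(d))) # lower bound on capacity
        upper = np.max(d)                     # upper bound on capacity
        p = p * np.exp(d)                     # multiplicative update
        p /= p.sum()
        if upper - lower < tol:
            break
    return lower, p

# Example: binary symmetric channel with crossover probability 0.1
W = np.array([[0.9, 0.1], [0.1, 0.9]])
C, p_star = blahut_arimoto(W)
print(C / np.log(2))   # ~0.531 bits, i.e. 1 - h(0.1)
```

The iteration alternates between computing the induced output distribution and reweighting the input distribution, with the gap between the two bounds certifying convergence; the paper's algorithm applies the same alternating-optimization idea to the broadcast simulation rate region.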