Optimal transport has numerous applications, particularly in machine learning tasks involving generative models. In practice, the transportation process often encounters an information bottleneck, typically arising from the conversion of a communication channel into a rate-limited bit pipeline using error correction codes. While this conversion enables a channel-oblivious approach to optimal transport, it fails to fully exploit the available degrees of freedom. Motivated by the emerging paradigm of generative communication, this paper examines the problem of channel-aware optimal transport, where a block of i.i.d. random variables is transmitted through a memoryless channel to generate another block of i.i.d. random variables with a prescribed marginal distribution such that the end-to-end distortion is minimized. With unlimited common randomness available to the encoder and decoder, the source-channel separation architecture is shown to be asymptotically optimal as the blocklength approaches infinity. On the other hand, in the absence of common randomness, the source-channel separation architecture is generally suboptimal. For this scenario, a hybrid coding scheme is proposed, which partially retains the generative capabilities of the given channel while enabling reliable transmission of digital information. It is demonstrated that the proposed hybrid coding scheme can outperform both separation-based and uncoded schemes.
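To make the underlying objective concrete, the following sketch illustrates the optimal transport problem the abstract describes in its simplest setting: mapping samples of one distribution into samples with a prescribed marginal while minimizing squared-error distortion. This is illustrative only and is not the paper's coding scheme; the distributions, sample size, and monotone-coupling approach are assumptions chosen for the one-dimensional case, where sorting yields the optimal coupling under quadratic cost.

```python
import numpy as np

# Hypothetical 1-D example: transport X ~ N(0, 1) to the prescribed
# marginal Y ~ N(0, 4) with minimal mean squared distortion E[(X - Y)^2].
rng = np.random.default_rng(0)
n = 100_000
x = rng.normal(0.0, 1.0, n)      # source block (i.i.d. samples)
y = rng.normal(0.0, 2.0, n)      # samples with the prescribed marginal

# For 1-D distributions and quadratic cost, the optimal (monotone)
# coupling matches samples quantile-to-quantile, i.e. in sorted order.
distortion = np.mean((np.sort(x) - np.sort(y)) ** 2)

# For Gaussians N(0, s1^2) -> N(0, s2^2), the minimal squared
# Wasserstein-2 distortion is (s1 - s2)^2 = (1 - 2)^2 = 1.
print(round(distortion, 2))
```

Any other (e.g. random) pairing of the same samples would incur strictly higher distortion, which is what distinguishes optimal transport from merely matching marginals.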