We propose a framework for second-order achievability, called type deviation convergence, that is generally applicable to settings in network information theory and is especially suitable for lossy source coding and channel coding with cost. We give a second-order achievability bound for lossy source coding with side information at the decoder (the Wyner-Ziv problem) that improves upon all known bounds (e.g., Watanabe-Kuzuoka-Tan, Yassaee-Aref-Gohari, and Li-Anantharam). We also give second-order achievability bounds for lossy compression where side information may be absent (the Heegard-Berger problem) and for channels with noncausal state information at the encoder and a cost constraint (the Gelfand-Pinsker problem with cost) that improve upon previous bounds.
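For general context (this is the standard second-order form from the dispersion literature, not the specific bounds of this paper): for a lossy source coding problem with rate-distortion function $R(d)$ and excess-distortion probability $\epsilon$, a second-order achievability result at blocklength $n$ typically guarantees an achievable rate of $R(d) + \sqrt{V(d)/n}\,Q^{-1}(\epsilon) + O\!\left(\tfrac{\log n}{n}\right)$, where $V(d)$ is a dispersion quantity and $Q^{-1}$ is the inverse of the Gaussian complementary CDF. For channel coding with cost, where rate is maximized rather than minimized, the analogous expansion is $C - \sqrt{V/n}\,Q^{-1}(\epsilon) + O\!\left(\tfrac{\log n}{n}\right)$, with the sign of the second-order term reversed.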