The mutual information is analyzed as a function of the input distribution using an identity due to Topsøe, for channels with (possibly multiple) linear cost constraints and finite input and output sets. The mutual information is bounded above by a function decreasing quadratically with the distance to the set of all capacity-achieving input distributions, whenever that distance is below a certain threshold. Closed-form expressions for the threshold and for the coefficient of the quadratic decrease are derived. A counterexample demonstrating the non-existence of such a quadratic bound in the case of infinitely many linear cost constraints is provided. Implications of these observations for the channel coding problem, and applications of the proof technique to related problems, are discussed.
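The quadratic bound described above can be sketched schematically as follows; the symbols $\gamma$ and $\tau$ are placeholder names (not taken from the paper) for the coefficient and the threshold whose closed-form expressions are derived:

```latex
% Schematic form of the stated bound: for a channel W with capacity C,
% input distribution p, and set \Pi of capacity-achieving input
% distributions, whenever the distance of p to \Pi is small enough,
%     I(p; W) <= C - \gamma * d(p, \Pi)^2   for  d(p, \Pi) <= \tau,
% where \gamma > 0 is the coefficient of the quadratic decrease and
% \tau > 0 is the threshold (both given in closed form in the paper).
I(p;W) \;\le\; C \;-\; \gamma\, d(p,\Pi)^{2},
\qquad \text{whenever } d(p,\Pi) \le \tau .
```

The counterexample in the abstract shows that no such pair $(\gamma,\tau)$ can exist uniformly when the number of linear cost constraints is infinite.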