The Augustin--Csisz\'{a}r mutual information (MI) and the Lapidoth--Pfister MI are well-known generalizations of the Shannon MI, but neither has a known closed-form expression, so both must be computed by solving optimization problems. In this study, we propose alternating optimization algorithms for computing these two types of MI and prove their global convergence. We also provide a novel variational characterization of the Augustin--Csisz\'{a}r MI that parallels that of the Sibson MI.
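For reference, the three quantities are commonly given by the following R\'{e}nyi-divergence optimizations; the notation below is the conventional one from the literature rather than this work's, with $D_\alpha$ the R\'{e}nyi divergence of order $\alpha$, $W$ a channel, and $Q_X$, $Q_Y$ auxiliary distributions:
\begin{align*}
  I_\alpha^{\mathrm{S}}(P_X, W)  &= \min_{Q_Y} D_\alpha\!\left(P_X \times W \,\middle\|\, P_X \times Q_Y\right), \\
  I_\alpha^{\mathrm{A}}(P_X, W)  &= \min_{Q_Y} \sum_x P_X(x)\, D_\alpha\!\left(W(\cdot \mid x) \,\middle\|\, Q_Y\right), \\
  I_\alpha^{\mathrm{LP}}(P_{XY}) &= \min_{Q_X,\, Q_Y} D_\alpha\!\left(P_{XY} \,\middle\|\, Q_X \times Q_Y\right).
\end{align*}
The inner minimization defining the Sibson MI admits a closed-form solution, whereas the Augustin--Csisz\'{a}r and Lapidoth--Pfister minimizations generally do not, which is why iterative schemes such as alternating optimization are natural for the latter two.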