In information theory -- as well as in the adjacent fields of statistics, machine learning, artificial intelligence, signal processing and pattern recognition -- many flexibilizations of the omnipresent Kullback-Leibler information distance (relative entropy) and of the closely related Shannon entropy have become frequently used tools. The main goal of this paper is to tackle the corresponding constrained minimization (respectively, maximization) problems by a newly developed, dimension-free bare (pure) simulation method. Within our discrete setup of arbitrary dimension, almost no assumptions (such as convexity) on the constraint set are needed, and our method is precise, i.e., it converges in the limit. As a side effect, we also derive an innovative way of constructing new, useful distances/divergences. To illustrate the core of our approach, we present numerous solved cases. We also indicate the potential for widespread applicability; in particular, we provide many recent references for uses of the involved distances/divergences and entropies across various research fields (which may also serve as an interdisciplinary interface).
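To make the objects in the abstract concrete, the following sketch computes the discrete Kullback-Leibler divergence and illustrates a generic (naive) Monte Carlo search for a constrained KL-minimizer. This is only a hedged illustration of the problem class; it is not the paper's bare-simulation method, and the specific constraint set and reference distribution are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def kl_divergence(p, q):
    """Discrete Kullback-Leibler divergence D(p || q); q assumed full-support."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0  # 0 * log(0/q) := 0 by convention
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

# Hypothetical reference distribution on 3 outcomes.
q = np.array([0.5, 0.3, 0.2])

# Naive simulation search: sample candidate distributions uniformly from the
# simplex and keep the KL-minimizer among those satisfying an (arbitrary,
# non-convexity-requiring) constraint, here p[0] >= 0.6.
best_p, best_val = None, np.inf
for _ in range(20000):
    p = rng.dirichlet(np.ones(3))
    if p[0] >= 0.6:
        val = kl_divergence(p, q)
        if val < best_val:
            best_p, best_val = p, val
```

With enough samples, `best_val` approaches the true constrained minimum; the paper's contribution is a precise, dimension-free simulation scheme for such problems rather than this brute-force baseline.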