In information theory -- as well as in the adjacent fields of statistics, machine learning, artificial intelligence, signal processing and pattern recognition -- many flexibilizations of the omnipresent Kullback-Leibler information distance (relative entropy) and of the closely related Shannon entropy have become frequently used tools. The main goal of this paper is to tackle the corresponding constrained minimization (respectively, maximization) problems by a newly developed dimension-free bare (pure) simulation method. Within our discrete setup of arbitrary dimension, almost no assumptions (such as convexity) on the set of constraints are needed, and our method is precise (i.e., it converges in the limit). As a byproduct, we also derive an innovative way of constructing new useful distances/divergences. To illustrate the core of our approach, we present numerous examples. The potential for widespread applicability is indicated as well; in particular, we provide many recent references for uses of the involved distances/divergences and entropies in various research fields (which may also serve as an interdisciplinary interface).
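For readers less familiar with the central quantity mentioned above, the following minimal sketch computes the discrete Kullback-Leibler information distance (relative entropy) between two probability vectors. This is only a background illustration of the divergence itself, not of the paper's bare-simulation method; the function name and the example distributions are chosen for illustration.

```python
import math

def kl_divergence(p, q):
    """Discrete Kullback-Leibler divergence D(p || q) in nats.

    Assumes p and q are probability vectors of equal length,
    with q[i] > 0 wherever p[i] > 0 (absolute continuity).
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
d = kl_divergence(p, q)  # small positive value; D(p || q) = 0 iff p == q
```

Note that D(p || q) is nonnegative and vanishes exactly when the two distributions coincide, which is what makes it suitable as an information "distance" despite not being symmetric.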