Uncertain inputs of a mathematical model induce uncertainties in the output, and probabilistic sensitivity analysis identifies the influential inputs to guide decision-making. Of practical concern is the probability that the output would, or would not, exceed a threshold, and the probability sensitivity depends on this threshold, which is often itself uncertain. The Fisher information and the Kullback-Leibler divergence have recently been proposed in the literature as threshold-independent sensitivity metrics. We present a mathematical proof that these information-theoretic metrics provide an upper bound for the probability sensitivity. The proof is elementary, relying only on a special case of the Cauchy-Schwarz inequality known as Titu's lemma. Although various inequalities exist for probabilities, little is known about bounds on probability sensitivity, and the bound proposed here is, to the present authors' knowledge, new. The probability sensitivity bound is extended, analytically and with numerical examples, to the Fisher information of both the input and the output. It thus provides a solid mathematical basis for decision-making based on probabilistic sensitivity metrics.
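For reference, the standard statement of Titu's lemma (the Engel form of the Cauchy-Schwarz inequality) invoked here is: for real numbers $a_1,\dots,a_n$ and positive reals $b_1,\dots,b_n$,
\[
\sum_{i=1}^{n} \frac{a_i^{2}}{b_i} \;\geq\; \frac{\left(\sum_{i=1}^{n} a_i\right)^{2}}{\sum_{i=1}^{n} b_i},
\]
with equality if and only if the ratios $a_i/b_i$ are all equal.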