Exponential families are statistical models that serve as workhorses in statistics, information theory, and machine learning. An exponential family can be normalized either subtractively by its cumulant function or, equivalently, divisively by its partition function. Both the subtractive and divisive normalizers are strictly convex and smooth functions that induce pairs of Bregman and Jensen divergences. It is well known that the skewed Bhattacharyya distances between probability densities of an exponential family amount to skewed Jensen divergences induced by the cumulant function between their corresponding natural parameters, and that in the limit cases the sided Kullback-Leibler divergences amount to reverse-sided Bregman divergences. In this note, we first show that the $\alpha$-divergences between unnormalized densities of an exponential family amount to scaled $\alpha$-skewed Jensen divergences induced by the partition function. We then show how comparative convexity with respect to a pair of quasi-arithmetic means allows one to deform convex functions and define dually flat spaces with corresponding divergences when ordinary convexity is preserved.
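For concreteness, here is a minimal sketch of the stated identities, assuming the standard parameterization $p_\theta(x)=\exp(\langle\theta,t(x)\rangle-F(\theta))$ with cumulant function $F(\theta)=\log Z(\theta)$, partition function $Z(\theta)$, unnormalized density $\tilde p_\theta(x)=\exp(\langle\theta,t(x)\rangle)$, and $\alpha$-skewed Jensen divergence $J_{G,\alpha}(\theta_1,\theta_2)=\alpha G(\theta_1)+(1-\alpha)G(\theta_2)-G(\alpha\theta_1+(1-\alpha)\theta_2)$ for a strictly convex generator $G$ (the notation here is an illustrative choice, not necessarily that of the full paper):
\begin{align*}
D_{B,\alpha}[p_{\theta_1}:p_{\theta_2}] &= -\log\int p_{\theta_1}^{\alpha}(x)\,p_{\theta_2}^{1-\alpha}(x)\,\mathrm{d}\mu(x) = J_{F,\alpha}(\theta_1,\theta_2),\\
\mathrm{KL}[p_{\theta_1}:p_{\theta_2}] &= B_F(\theta_2:\theta_1), \quad B_F(\theta_1:\theta_2)=F(\theta_1)-F(\theta_2)-\langle\theta_1-\theta_2,\nabla F(\theta_2)\rangle,\\
D_{\alpha}[\tilde p_{\theta_1}:\tilde p_{\theta_2}] &= \frac{1}{\alpha(1-\alpha)}\,J_{Z,\alpha}(\theta_1,\theta_2), \qquad \alpha\in(0,1),
\end{align*}
where the last identity follows from $\int\tilde p_{\theta_1}^{\alpha}\tilde p_{\theta_2}^{1-\alpha}\,\mathrm{d}\mu = Z(\alpha\theta_1+(1-\alpha)\theta_2)$ applied to the $\alpha$-divergence between positive densities, $D_\alpha[\tilde p:\tilde q]=\frac{1}{\alpha(1-\alpha)}\int\bigl(\alpha\tilde p+(1-\alpha)\tilde q-\tilde p^{\alpha}\tilde q^{1-\alpha}\bigr)\,\mathrm{d}\mu$.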