Minimum divergence procedures based on the density power divergence and the logarithmic density power divergence have been extremely popular and successful in generating inference procedures which combine a high degree of model efficiency with strong outlier stability. Such procedures are always preferable in practical situations to procedures which achieve their robustness at a major cost in efficiency, or which are highly efficient but have poor robustness properties. The density power divergence (DPD) family of Basu et al. (1998) and the logarithmic density power divergence (LDPD) family of Jones et al. (2001) provide flexible classes of divergences where the trade-off between efficiency and robustness is controlled by a single, real, non-negative tuning parameter. The usefulness of these two families of divergences in statistical inference makes it meaningful to search for other related families of divergences in the same spirit. The DPD family is a member of the class of Bregman divergences, and the LDPD family is obtained by log transformations of the different segments of the divergences within the DPD family. Both the DPD and LDPD families lead to the Kullback-Leibler divergence in the limiting case as the tuning parameter $\alpha \rightarrow 0$. In this paper we study this relation in detail, and demonstrate that such log transformations can only be meaningful in the context of the DPD (or the convex generating function of the DPD) within the general fold of Bregman divergences, giving us a limit to the extent to which the search for useful divergences can be successful.
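For concreteness, the following is a sketch of the standard forms of these two families as they appear in the cited literature, assuming the common convention that $g$ denotes the true (data) density and $f$ the model density; this notation is illustrative and may differ from that used in the body of the paper. For a tuning parameter $\alpha > 0$, the DPD of Basu et al. (1998) has the form
\[
d_\alpha(g, f) = \int \left\{ f^{1+\alpha} - \left(1 + \frac{1}{\alpha}\right) f^{\alpha} g + \frac{1}{\alpha}\, g^{1+\alpha} \right\} dx,
\]
while the LDPD of Jones et al. (2001) is obtained by log transformations of the corresponding segments,
\[
d_\alpha^{\mathrm{LDPD}}(g, f) = \frac{1}{1+\alpha} \log \int f^{1+\alpha}\, dx - \frac{1}{\alpha} \log \int f^{\alpha} g \, dx + \frac{1}{\alpha(1+\alpha)} \log \int g^{1+\alpha}\, dx.
\]
As $\alpha \rightarrow 0$, both expressions converge to the Kullback-Leibler divergence $\int g \log(g/f)\, dx$.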