Robust inference based on the minimization of statistical divergences has proved to be a useful alternative to classical techniques based on maximum likelihood and related methods. Basu et al. (1998) introduced the density power divergence (DPD) family as a measure of discrepancy between two probability density functions and used this family for robust estimation of the model parameter under independent and identically distributed data. Ghosh et al. (2017) proposed a more general class of divergence measures, namely the S-divergence family, and discussed its usefulness in robust parametric estimation through several asymptotic properties and numerical illustrations. In this paper, we develop results on the asymptotic breakdown point of the minimum S-divergence estimators (in particular the minimum DPD estimator) under general model setups. The primary result of this paper provides lower bounds on the asymptotic breakdown point of these estimators which are independent of the dimension of the data, in turn corroborating their usefulness in robust inference under high dimensional data.
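As background for readers unfamiliar with minimum divergence estimation, the following is a minimal numerical sketch (not taken from the paper) of how a minimum DPD estimator resists contamination, using the N(mu, 1) location model with known scale. For this model the DPD objective of Basu et al. (1998) is H_n(mu) = \int f_mu^{1+a} dx - (1 + 1/a) * mean(f_mu(X_i)^a), and the first integral does not depend on mu, so minimizing H_n reduces to maximizing the empirical mean of f_mu(X_i)^a. The function names, the grid-search minimizer, and the contamination scheme below are all illustrative choices, not constructions from the paper.

```python
import numpy as np

def dpd_objective(mu, x, alpha):
    """Empirical DPD objective H_n(mu) for the N(mu, 1) location model."""
    # Model density evaluated at each data point.
    f = np.exp(-0.5 * (x - mu) ** 2) / np.sqrt(2 * np.pi)
    # int f_mu^{1+alpha} dx for the normal density; constant in mu.
    const = (2 * np.pi) ** (-alpha / 2) / np.sqrt(1 + alpha)
    return const - (1 + 1 / alpha) * np.mean(f ** alpha)

def min_dpd_location(x, alpha=0.5):
    """Minimum DPD estimate of mu via a simple grid search (illustration only)."""
    grid = np.linspace(x.min(), x.max(), 2001)
    vals = [dpd_objective(m, x, alpha) for m in grid]
    return grid[int(np.argmin(vals))]

rng = np.random.default_rng(0)
clean = rng.normal(0.0, 1.0, size=90)   # bulk of the data from N(0, 1)
outliers = np.full(10, 10.0)            # 10% gross contamination at +10
x = np.concatenate([clean, outliers])

mle = x.mean()                          # MLE of mu: pulled toward the outliers
dpd = min_dpd_location(x, alpha=0.5)    # minimum DPD estimate: stays near 0
print(mle, dpd)
```

The sample mean is dragged toward the contaminating mass, while the minimum DPD estimate remains close to the true location 0, because the term mean(f_mu(X_i)^alpha) automatically downweights observations at which the model density is negligible.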
Translated title: Breakdown Point Analysis of Minimum S-Divergence Estimators