Many functionals of interest in statistics and machine learning can be written as minimizers of expected loss functions. Such functionals are called $M$-estimands, and they can be estimated by $M$-estimators, the minimizers of empirical average losses. Traditionally, statistical inference (e.g., hypothesis tests and confidence sets) for $M$-estimands is obtained by proving asymptotic normality of $M$-estimators centered at the target. However, the normal distribution is only one of several possible limiting distributions, and (asymptotically) valid inference becomes significantly more difficult with non-normal limits. In this paper, we provide conditions for the symmetry of three general classes of limiting distributions, enabling inference using HulC (Kuchibhotla et al., 2024).
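For concreteness, a minimal formulation of the setup described above (the notation here is illustrative, not taken from the paper): given a loss $\ell(\theta; Z)$ and i.i.d. data $Z_1, \dots, Z_n \sim P$, the $M$-estimand and the $M$-estimator are
\[
\theta^{*} = \arg\min_{\theta \in \Theta} \mathbb{E}_{P}\left[\ell(\theta; Z)\right],
\qquad
\widehat{\theta}_n = \arg\min_{\theta \in \Theta} \frac{1}{n} \sum_{i=1}^{n} \ell(\theta; Z_i).
\]
For example, $\ell(\theta; Z) = (Z - \theta)^2$ gives the mean as estimand and the sample mean as estimator, while $\ell(\theta; Z) = |Z - \theta|$ gives the median. As a sketch of why symmetry of the limiting distribution matters here: HulC forms a confidence set as the convex hull of estimators $\widehat{\theta}^{(1)}, \dots, \widehat{\theta}^{(B)}$ computed on $B$ disjoint batches of the data; if each batch estimator is (asymptotically) median-unbiased for $\theta^{*}$, as symmetry of the limit guarantees, the hull misses $\theta^{*}$ with probability at most $2^{1-B}$, so $B = \lceil \log_2(2/\alpha) \rceil$ batches yield level $1-\alpha$ (Kuchibhotla et al., 2024).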