This paper investigates the asymptotic properties of least absolute deviation (LAD) regression for linear models with polynomial regressors, highlighting its robustness against heavy-tailed noise and outliers. Assuming independent and identically distributed (i.i.d.) errors, we establish the multiscale asymptotic normality of LAD estimators. A central result is the derivation of the asymptotic precision matrix, shown to be proportional to Hilbert matrices, with the proportionality coefficient determined by the asymptotic variance of the sample median of the noise distribution. We further explore the estimator's convergence properties, both in probability and almost surely, under varying model specifications. Through comprehensive simulations, we evaluate the rate of convergence of the LAD estimator and the empirical coverage probabilities of confidence intervals constructed under different scaling factors ($T^{1/2}$ and $T^{\alpha}$). These experiments incorporate a range of noise distributions, including Laplace, Gaussian, and Cauchy, to demonstrate the estimator's robustness and efficiency. The findings underscore the versatility and practical relevance of LAD regression in handling non-standard data environments. By connecting the statistical properties of LAD estimators to classical mathematical structures, such as Hilbert matrices, this study offers both theoretical insights and practical tools for robust statistical modeling.
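For orientation, a minimal sketch of the two classical objects the abstract invokes (the exact normalization and indexing used in the paper may differ): the Hilbert matrix generated by the moments of polynomial regressors $1, t, \dots, t^p$ on $[0,1]$, and the standard asymptotic variance of the sample median $\widehat{m}_T$ of i.i.d. noise with density $f$ having a unique median at zero and $f(0) > 0$.
\[
  H_{ij} \;=\; \int_0^1 x^{\,i-1}\, x^{\,j-1}\,\mathrm{d}x \;=\; \frac{1}{i+j-1},
  \qquad 1 \le i, j \le p+1,
\]
\[
  \sqrt{T}\,\widehat{m}_T \;\xrightarrow{\;d\;}\; \mathcal{N}\!\left(0,\; \frac{1}{4 f(0)^2}\right).
\]
The factor $1/(4 f(0)^2)$ is the classical asymptotic variance of the sample median; in the setting described by the abstract, it would play the role of the proportionality coefficient linking the precision matrix to the Hilbert structure.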