We propose a KL-divergence-based procedure for testing elliptical distributions. The procedure simultaneously takes into account the two defining properties of an elliptically distributed random vector: independence between length and direction, and uniform distribution of the direction. The test statistic is constructed with the $k$-nearest-neighbor ($k$NN) method, and two cases are considered in which the mean vector and covariance matrix are either known or unknown. First-order asymptotic properties of the test statistic are rigorously established by creatively utilizing sample splitting, truncation, and transformations between Euclidean space and the unit sphere, while avoiding any assumption of Fr\'echet differentiability of functionals. Debiasing and variance inflation are further proposed to address the degeneracy of the influence function. Numerical experiments suggest better size and power performance than state-of-the-art procedures.
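For concreteness, here is a hedged sketch of the ingredients (not the authors' exact statistic, which further involves sample splitting, truncation, debiasing, and variance inflation): writing $Z=\Sigma^{-1/2}(X-\mu)$ with the true or estimated mean $\mu$ and covariance $\Sigma$, and setting $R=\|Z\|$ and $U=Z/\|Z\|$, the null of ellipticity states that $R$ and $U$ are independent and $U$ is uniform on the unit sphere $\mathcal{S}^{d-1}$. A generic $k$NN-type divergence estimate between samples $\{V_i\}_{i=1}^{n}\sim P$ and $\{W_j\}_{j=1}^{m}\sim Q$ in $\mathbb{R}^{d}$ (all symbols in this display are illustrative) takes the form
\[
\widehat{D}(P\,\|\,Q)=\frac{d}{n}\sum_{i=1}^{n}\log\frac{\nu_k(i)}{\rho_k(i)}+\log\frac{m}{n-1},
\]
where $\rho_k(i)$ is the distance from $V_i$ to its $k$th nearest neighbor among $\{V_j\}_{j\neq i}$ and $\nu_k(i)$ is the distance from $V_i$ to its $k$th nearest neighbor among $\{W_j\}_{j=1}^{m}$. Applying such an estimate to the joint law of $(R,U)$ against the product of the law of $R$ and the uniform law on $\mathcal{S}^{d-1}$ yields a divergence that vanishes precisely under the null, a natural population target for a KL-divergence-based test of ellipticity.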