Quantifying the risks of extreme scenarios requires understanding the tail behaviours of the variables of interest. While the tail of an individual variable can be characterized parametrically, the extremal dependence across variables can be complex, and its modelling remains one of the core problems in extreme value analysis. Notably, existing measures of extremal dependence, such as angular components and spectral random vectors, reside on nonlinear supports, so that statistical models and methods designed for linear vector spaces cannot be readily applied. In this paper, we show that the extremal dependence of $d$ asymptotically dependent variables can be characterized by a class of random vectors residing on a $(d-1)$-dimensional hyperplane. This translates the analysis of multivariate extremes to one on a linear vector space, opening up the potential for applying existing statistical techniques, particularly in statistical learning and dimension reduction. As an example, we show that a lower-dimensional approximation of multivariate extremes can be achieved through principal component analysis on the hyperplane. Moreover, within this framework, the widely used H\"usler-Reiss family for modelling extremes is characterized by the Gaussian family residing on the hyperplane, thereby justifying its status as the Gaussian counterpart for extremes.
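As a rough illustration of the dimension-reduction idea, and not the paper's exact construction, the sketch below simulates a toy asymptotically dependent sample with heavy-tailed margins, maps the extreme observations to the hyperplane $\{x \in \mathbb{R}^d : \sum_i x_i = 0\}$ via a centred-log transform, and runs ordinary principal component analysis there. The simulation scheme, the threshold level, and the centred-log map are all assumptions made for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 5, 100_000

# Toy asymptotically dependent sample (an assumption for illustration):
# a common standard-Pareto radial factor times positive multiplicative noise.
R = 1.0 / rng.uniform(size=(n, 1))           # heavy-tailed radial factor
W = rng.lognormal(sigma=0.5, size=(n, d))    # positive noise
X = R * W

# Keep observations whose sum norm exceeds a high empirical quantile.
norm = X.sum(axis=1)
extremes = X[norm > np.quantile(norm, 0.99)]

# Centred-log map: after subtracting the row mean of the logs, each row sums
# to zero, i.e. it lies on the (d-1)-dimensional hyperplane.
Y = np.log(extremes)
Y = Y - Y.mean(axis=1, keepdims=True)

# Ordinary PCA on the hyperplane-valued sample via SVD of the centred matrix.
Yc = Y - Y.mean(axis=0)
_, s, Vt = np.linalg.svd(Yc, full_matrices=False)
explained = s**2 / np.sum(s**2)
print("variance explained by leading directions:", np.round(explained, 3))
```

Because the mapped sample lies in a $(d-1)$-dimensional linear subspace, the smallest singular value is numerically zero, and a lower-dimensional approximation of the extremal dependence is obtained by retaining the leading principal directions.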