Hydrological post-processing using quantile regression algorithms is a prime means of estimating the uncertainty of hydrological predictions. Nonetheless, conventional large-sample theory for quantile regression does not hold sufficiently far in the tails of the probability distribution of the dependent variable. To overcome this limitation, which can be crucial when the interest lies in flood events, hydrological post-processing through extremal quantile regression is introduced here for estimating extreme quantiles of hydrological model responses. In summary, the new hydrological post-processing method exploits properties of the Hill estimator from extreme value theory to extrapolate quantile regression predictions to high quantiles. As a proof of concept, the new method is tested here in post-processing daily streamflow simulations provided by three process-based hydrological models for 180 basins in the contiguous United States (CONUS) and is further compared to conventional quantile regression. This large-scale comparison demonstrates that hydrological post-processing using conventional quantile regression severely underestimates high quantiles (at the quantile level 0.9999) compared to hydrological post-processing using extremal quantile regression, although both methods are practically equivalent at lower quantiles (at the quantile level 0.9700). Moreover, it is shown that, in the same context, extremal quantile regression estimates the high predictive quantiles with efficiency that is, on average, equivalent across the three process-based hydrological models in the large-sample study.
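For illustration only, the following is a minimal sketch of the extrapolation idea (intermediate-level quantile regression combined with a Hill tail-index estimate and a Weissman-type scaling), assuming Python with numpy and statsmodels. The function names (hill_tail_index, extremal_quantile), the synthetic streamflow data, and the use of an unconditional Hill estimate are illustrative assumptions and do not reproduce the exact procedure of the paper.

```python
import numpy as np
import statsmodels.api as sm


def hill_tail_index(y, k):
    """Hill estimator of the tail index from the k largest observations of y (y > 0)."""
    y_sorted = np.sort(y)
    top = y_sorted[-k:]           # k largest order statistics
    threshold = y_sorted[-k - 1]  # (k+1)-th largest value used as the threshold
    return np.mean(np.log(top / threshold))


def extremal_quantile(x_design, y, tau_base, tau_extreme, k):
    """Extrapolate a quantile-regression prediction at an intermediate level tau_base
    to an extreme level tau_extreme via a Weissman-type scaling with a Hill estimate.
    This is a simplified sketch, not the authors' full method."""
    qr = sm.QuantReg(y, x_design).fit(q=tau_base)
    q_base = qr.predict(x_design)                     # conditional quantile at tau_base
    xi = hill_tail_index(y, k)                        # tail-index estimate
    scale = ((1.0 - tau_base) / (1.0 - tau_extreme)) ** xi
    return q_base * scale


# Toy usage with synthetic heavy-tailed "streamflow" data (hypothetical, for illustration).
rng = np.random.default_rng(0)
sim = rng.lognormal(mean=1.0, sigma=0.8, size=5000)   # simulated streamflow (predictor)
obs = sim + sim * rng.pareto(3.0, size=5000)          # heavy-tailed "observed" streamflow
X = sm.add_constant(np.log(sim))
q_extreme = extremal_quantile(X, obs, tau_base=0.97, tau_extreme=0.9999, k=200)
```

The key design choice sketched here is that the quantile regression is fitted at a level (0.97) where standard asymptotics are reliable, and only the extrapolation step relies on the heavy-tail assumption through the estimated tail index.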