In this article, we investigate the problem of estimating a spatially inhomogeneous function and its derivatives in the white noise model using Besov-Laplace priors. We show that smoothness-matching priors attain minimax-optimal posterior contraction rates, in strong Sobolev metrics, over the Besov spaces $B^\beta_{11}$, $\beta > d/2$, closing a gap in the existing literature. Our strong posterior contraction rates also imply that the posterior distributions arising from Besov-Laplace priors with matching regularity enjoy a desirable plug-in property for derivative estimation, entailing that the push-forward measures under differential operators optimally recover the derivatives of the unknown regression function. The proof of our results relies on the novel approach to posterior contraction rates, based on the Wasserstein distance, recently developed by Dolera, Favaro and Mainini (Probability Theory and Related Fields, 2024). We show how this approach allows us to overcome some technical challenges that emerge in the frequentist analysis of smoothness-matching Besov-Laplace priors.
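As background (not stated in the abstract itself), the following is a schematic sketch of the two objects named above, under standard conventions that may differ from the paper's; the notation $\psi_{lr}$, $\xi_{lr}$ and the regularity parameter $\alpha$ are assumptions introduced only for illustration. In the white noise model one observes, for a regression function $f$ on $[0,1]^d$ and noise level $n^{-1/2}$,
\[
dX^{(n)}(t) \;=\; f(t)\,dt \;+\; \frac{1}{\sqrt{n}}\,dW(t), \qquad t \in [0,1]^d,
\]
with $W$ a Gaussian white noise. A Besov-Laplace prior of regularity $\alpha$ is commonly constructed from a wavelet basis $\{\psi_{lr}\}$ by drawing independent Laplace-distributed coefficients,
\[
f \;=\; \sum_{l}\sum_{r} 2^{-l(\alpha + d/2 - d)}\,\xi_{lr}\,\psi_{lr}, \qquad \xi_{lr} \overset{iid}{\sim} \mathrm{Laplace}(1),
\]
whose realisations have Besov regularity (just below) $\alpha$ in the $B^\alpha_{11}$ scale; "smoothness-matching" refers to taking $\alpha$ equal to the regularity $\beta$ of the true function.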