We propose a novel Bayesian methodology for inference in functional linear and logistic regression models based on the theory of reproducing kernel Hilbert spaces (RKHSs). We introduce general models that build upon the RKHS generated by the covariance function of the underlying stochastic process, and whose formulation includes as particular cases all finite-dimensional models based on linear combinations of marginals of the process, which can collectively be seen as a dense subspace made of simple approximations. By imposing a suitable prior distribution on this dense functional space we can perform data-driven inference via standard Bayesian methodology, estimating the posterior distribution through reversible jump Markov chain Monte Carlo methods. In this context, our contribution is twofold. First, we derive a theoretical result that guarantees posterior consistency, based on an application of a classic theorem of Doob to our RKHS setting. Second, we show that several prediction strategies stemming from our Bayesian procedure, including a Bayesian-motivated variable selection method, are competitive with the usual alternatives in both simulations and real data sets.
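For concreteness, the finite-dimensional models mentioned above are those whose effect reduces to a linear combination of finitely many marginals of the process; in the linear case this can be written as (the intercept and noise conventions here are our illustrative assumptions, not a quotation of the paper's notation)

\[
Y = \beta_0 + \sum_{j=1}^{p} \beta_j X(t_j) + \varepsilon, \qquad \varepsilon \sim \mathcal{N}(0, \sigma^2),
\]

with the logistic analogue obtained by plugging the same linear predictor into a logit link. Since the dimension $p$ and the impact points $t_1, \dots, t_p$ are themselves unknown, the posterior lives on a union of spaces of different dimensions, which is why a reversible jump sampler is the natural computational tool.

The following is a minimal, self-contained sketch of such a reversible jump sampler on synthetic data. Every concrete choice in it (the geometric prior on $p$, the Gaussian priors, the known noise scale, the finite grid of candidate impact points, and the move probabilities) is an assumption made for illustration only, not the specification used in the paper:

```python
# Toy reversible-jump MCMC for a finite-dimensional functional linear model:
#   Y_i = sum_j beta_j X_i(t_j) + eps_i,  eps_i ~ N(0, sigma^2),
# with an unknown number p of impact points t_j on a fixed grid.
# All priors and hyperparameters below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic functional data: n rough Brownian-type curves on a grid of G points.
n, G = 200, 50
X = np.cumsum(rng.normal(size=(n, G)), axis=1) / np.sqrt(G)
true_idx, true_beta = np.array([10, 35]), np.array([2.0, -1.5])
sigma, tau = 0.5, 2.0                      # assumed known noise / prior scales
y = X[:, true_idx] @ true_beta + sigma * rng.normal(size=n)

def loglik(idx, beta):
    resid = y - X[:, idx] @ beta
    return -0.5 * np.sum(resid ** 2) / sigma ** 2

def log_beta_prior(beta):
    return -0.5 * np.sum(beta ** 2) / tau ** 2

# RJMCMC over (p, t_1..t_p, beta_1..beta_p); geometric prior pi(p) ~ (1/2)^p.
p_max, n_iter = 10, 20000
idx, beta = [rng.integers(G)], np.array([0.0])
samples = []
for it in range(n_iter):
    p, u = len(idx), rng.random()
    if u < 0.25 and p < p_max:             # birth: add a point, beta from prior
        new_t = rng.choice([t for t in range(G) if t not in idx])
        idx2 = idx + [new_t]
        beta2 = np.append(beta, tau * rng.normal())
        # The proposal density of the new coefficient cancels against its
        # prior, and the uniform point choices cancel likewise, leaving
        # only the likelihood ratio and the geometric prior ratio 1/2.
        log_acc = loglik(idx2, beta2) - loglik(idx, beta) + np.log(0.5)
        if np.log(rng.random()) < log_acc:
            idx, beta = idx2, beta2
    elif 0.25 <= u < 0.5 and p > 1:        # death: drop a uniformly chosen point
        j = rng.integers(p)
        idx2, beta2 = idx[:j] + idx[j + 1:], np.delete(beta, j)
        log_acc = loglik(idx2, beta2) - loglik(idx, beta) + np.log(2.0)
        if np.log(rng.random()) < log_acc:
            idx, beta = idx2, beta2
    elif u >= 0.5:                         # within-model random-walk update
        beta2 = beta + 0.1 * rng.normal(size=p)
        log_acc = (loglik(idx, beta2) + log_beta_prior(beta2)
                   - loglik(idx, beta) - log_beta_prior(beta))
        if np.log(rng.random()) < log_acc:
            beta = beta2
    # Invalid birth/death proposals fall through as rejections.
    samples.append((list(idx), beta.copy()))

print("posterior mode of p:",
      np.bincount([len(s[0]) for s in samples[5000:]]).argmax())
```

Drawing the birth coefficient from its own prior makes the proposal density cancel in the acceptance ratio (with a unit Jacobian), which is why the log-acceptance terms above reduce to a likelihood ratio plus the log of the geometric prior ratio; the posterior frequency of each grid point across the sampled models also gives a natural Bayesian-motivated variable (impact point) selection criterion of the kind the abstract mentions.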