Gaussian Processes and the Kullback-Leibler divergence have been deeply studied in Statistics and Machine Learning. This paper marries these two concepts and introduces the local Kullback-Leibler divergence to identify intervals where two Gaussian Processes differ the most. We address the subtleties entailed in estimating local divergences, as well as the corresponding interval of maximum local divergence. The estimation performance and the numerical efficiency of the proposed method are showcased via a Monte Carlo simulation study. In a medical research context, we assess the potential of the devised tools through the analysis of electrocardiogram signals.
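To fix intuition for the idea of a local divergence between two Gaussian Processes, the sketch below compares the marginal distributions of two hypothetical processes pointwise, using the standard closed-form KL divergence between univariate Gaussians, and locates where that pointwise divergence peaks. The grid, means, and standard deviations are illustrative assumptions, not the paper's actual estimators or data.

```python
import numpy as np

def kl_univariate_gaussian(mu1, s1, mu2, s2):
    """Closed-form KL( N(mu1, s1^2) || N(mu2, s2^2) )."""
    return np.log(s2 / s1) + (s1**2 + (mu1 - mu2) ** 2) / (2 * s2**2) - 0.5

# Hypothetical marginal means and standard deviations of two GPs on a grid.
t = np.linspace(0.0, 1.0, 200)
mu_f = np.sin(2 * np.pi * t)
mu_g = mu_f + np.exp(-((t - 0.7) ** 2) / 0.005)  # localized discrepancy near t = 0.7
s_f = np.full_like(t, 0.1)
s_g = np.full_like(t, 0.1)

# Pointwise KL divergence between the marginals, and its location of maximum.
kl = kl_univariate_gaussian(mu_f, s_f, mu_g, s_g)
t_star = t[np.argmax(kl)]  # close to 0.7, where the two processes differ the most
```

In practice the marginal means and variances would be estimated from data rather than specified, and the paper's method targets an interval of maximum divergence rather than a single grid point; this sketch only illustrates the pointwise building block.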