We present several generative and predictive algorithms based on the RKHS (reproducing kernel Hilbert space) methodology, which, most importantly, scale up efficiently to large datasets and high-dimensional data. It is well recognized that the RKHS methodology leads to efficient and robust algorithms for numerous tasks in data science, statistics, and scientific computation. However, the implementations existing in the literature are often difficult to scale up to large datasets. In this paper, we introduce a simple and robust divide-and-conquer methodology. It applies to large-scale datasets and relies on several kernel-based algorithms that distinguish between extrapolation, interpolation, and optimal transport steps. We explain how to select the suitable algorithm for a specific application based on feedback from performance criteria. Our primary focus is on applications and problems arising in industrial contexts, such as generating meshes for efficient numerical simulations, designing generators for conditional distributions, constructing transition probability matrices for statistical or stochastic applications, and addressing various tasks relevant to the Artificial Intelligence community. The proposed algorithms are highly relevant to supervised and unsupervised learning, generative methods, as well as reinforcement learning.
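To give a rough, concrete flavor of the kind of kernel-based divide-and-conquer computation referred to above (this is an illustrative sketch, not the paper's algorithm), the following Python snippet fits independent regularized RKHS interpolants on k-means blocks of a large dataset and routes each query point to the interpolant of its nearest block. The Gaussian kernel, the k-means partitioning, and all function names are assumptions made for illustration only.

```python
# Minimal sketch: RKHS (kernel ridge) interpolation combined with a simple
# divide-and-conquer partition of the training data. Illustrative only.
import numpy as np
from scipy.cluster.vq import kmeans2
from scipy.spatial.distance import cdist


def gaussian_kernel(X, Y, length_scale=1.0):
    """Gaussian (RBF) kernel matrix between two point sets."""
    return np.exp(-cdist(X, Y, "sqeuclidean") / (2.0 * length_scale**2))


def fit_local_interpolant(X, y, reg=1e-8, length_scale=1.0):
    """Solve the regularized RKHS interpolation problem on one data block."""
    K = gaussian_kernel(X, X, length_scale)
    alpha = np.linalg.solve(K + reg * np.eye(len(X)), y)
    return X, alpha


def predict_local(model, X_new, length_scale=1.0):
    """Evaluate one block's interpolant at new points."""
    X, alpha = model
    return gaussian_kernel(X_new, X, length_scale) @ alpha


def divide_and_conquer_interpolate(X, y, X_new, n_blocks=4, length_scale=1.0):
    """Partition the data, fit one kernel interpolant per block, and
    route each query point to the interpolant of its nearest block center."""
    centers, labels = kmeans2(X, n_blocks, minit="++", seed=0)
    models = [
        fit_local_interpolant(X[labels == b], y[labels == b],
                              length_scale=length_scale)
        for b in range(n_blocks)
    ]
    block_of_query = np.argmin(cdist(X_new, centers), axis=1)
    y_new = np.empty(len(X_new))
    for b in range(n_blocks):
        mask = block_of_query == b
        if mask.any():
            y_new[mask] = predict_local(models[b], X_new[mask], length_scale)
    return y_new


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(2000, 2))
    y = np.sin(X[:, 0]) * np.cos(X[:, 1])
    X_new = rng.uniform(-3, 3, size=(10, 2))
    print(divide_and_conquer_interpolate(X, y, X_new, n_blocks=8))
```

The point of the sketch is the scaling argument: each block requires solving a linear system only in its own (much smaller) kernel matrix, rather than one dense system over the full dataset, which is the bottleneck the divide-and-conquer approach is meant to avoid.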