Laplacian regularization is a popular smoothing technique in machine learning. It is particularly useful when the ambiguity of the data calls for a criterion to disambiguate between candidate functions explaining the data, be it in spectral clustering or in semi-supervised learning. While Laplacian regularization is usually approached through diffusion on neighborhood graphs, we present a kernel-based approach built on derivative evaluation maps. We derive an analytical solution of empirical risk minimization with kernel Laplacian regularization, and we prove strong consistency of our estimator as the number of data points goes to infinity. Moreover, we show that, under regularity assumptions, our kernel method bypasses the curse of dimensionality, hence providing a strong alternative to neighborhood graph methods, which do not avoid it.
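To make the objective concrete, here is a minimal sketch of what empirical risk minimization with kernel Laplacian regularization can look like in a least-squares setting; the weighting and the regularization parameters $\lambda$ and $\mu$ are our notational assumptions and need not match the paper's exact formulation:

$$
\hat{f} \in \operatorname*{arg\,min}_{f \in \mathcal{H}} \; \frac{1}{n} \sum_{i=1}^{n} \big( f(x_i) - y_i \big)^2 \;+\; \frac{\lambda}{n} \sum_{i=1}^{n} \big\| \nabla f(x_i) \big\|^2 \;+\; \mu \, \| f \|_{\mathcal{H}}^2 ,
$$

where $\mathcal{H}$ is a reproducing kernel Hilbert space with a sufficiently smooth kernel $k$. Since partial derivatives then admit the reproducing property $\partial_j f(x_i) = \langle f, \partial_j k(x_i, \cdot) \rangle_{\mathcal{H}}$, both regularizers are quadratic in $f$, a representer-type argument confines the minimizer to $\mathrm{span}\{ k(x_i, \cdot), \partial_j k(x_i, \cdot) \}$, and the estimator is obtained by solving a finite-dimensional linear system, which is the sense in which the solution is analytical.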