Tackling semi-supervised learning problems with graph-based methods has become a trend in recent years, since graphs can represent all kinds of data and provide a suitable framework for studying continuum limits, e.g., of differential operators. A popular strategy here is $p$-Laplacian learning, which imposes a smoothness condition on the inference function sought on the set of unlabeled data. For $p<\infty$, continuum limits of this approach were studied using tools from $\Gamma$-convergence. For the case $p=\infty$, which is referred to as Lipschitz learning, continuum limits of the related infinity-Laplacian equation were studied using the concept of viscosity solutions. In this work, we prove continuum limits of Lipschitz learning using $\Gamma$-convergence. In particular, we define a sequence of functionals which approximate the largest local Lipschitz constant of a graph function, and we prove $\Gamma$-convergence in the $L^\infty$-topology to the supremum norm of the gradient as the graph becomes denser. Furthermore, we show compactness of the functionals, which implies convergence of minimizers. In our analysis we allow the set of labeled data to vary and to converge to a general closed set in the Hausdorff distance. We apply our results to nonlinear ground states, i.e., minimizers with constrained $L^p$-norm, and, as a by-product, we prove convergence of graph distance functions to geodesic distance functions.
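To fix ideas, the objects involved can be sketched schematically as follows; the kernel $\eta$, bandwidth $\varepsilon_n$, and scaling below are illustrative placeholders and not necessarily the precise definitions used in the paper. On a point cloud $\Omega_n = \{x_1,\dots,x_n\} \subset \Omega$ with graph bandwidth $\varepsilon_n \to 0$, a discrete energy of the form
$$
\mathcal{E}_n(u) \;=\; \max_{1 \le i,j \le n} \, \eta\!\left(\frac{|x_i - x_j|}{\varepsilon_n}\right) \frac{|u(x_i) - u(x_j)|}{\varepsilon_n}
$$
measures the largest local Lipschitz constant of a graph function $u \colon \Omega_n \to \mathbb{R}$, and the candidate continuum functional is
$$
\mathcal{E}_\infty(u) \;=\; \|\nabla u\|_{L^\infty(\Omega)},
$$
so that, under the constraints on the labeled set, $\Gamma$-convergence of $\mathcal{E}_n$ to $\mathcal{E}_\infty$ in the $L^\infty$-topology is the type of statement proved here.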