In this paper we prove the first quantitative convergence rates for the graph infinity Laplace equation at length scales near the connectivity threshold. In the graph-based semi-supervised learning community this equation is also known as Lipschitz learning. The graph infinity Laplace equation is characterized by the metric on the underlying space, and convergence rates follow from convergence rates for graph distances. At the connectivity threshold, this problem is related to Euclidean first passage percolation, which is concerned with the Euclidean distance function $d_{h}(x,y)$ on a homogeneous Poisson point process on $\mathbb{R}^d$, where admissible paths have step size at most $h>0$. Using a suitable regularization of the distance function and subadditivity, we prove that ${d_{h_s}(0,se_1)}/ s \to \sigma$ as $s\to\infty$ almost surely, where $\sigma \geq 1$ is a dimensional constant and $h_s\gtrsim \log(s)^\frac{1}{d}$. A convergence rate is not available due to a lack of approximate superadditivity when $h_s\to \infty$. Instead, we prove convergence rates for the ratio $\frac{d_{h}(0,se_1)}{d_{h}(0,2se_1)}\to \frac{1}{2}$ when $h$ is frozen and does not depend on $s$. Combining this with the techniques that we developed in (Bungert, Calder, Roith, IMA Journal of Numerical Analysis, 2022), we show that this notion of ratio convergence is sufficient to establish uniform convergence rates for solutions of the graph infinity Laplace equation at percolation length scales.
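The distance function $d_h$ described above can be simulated numerically: sample a homogeneous Poisson point process in a box, connect points whose Euclidean separation is at most $h$, and run Dijkstra's algorithm with Euclidean edge weights. The following is a minimal stdlib-only sketch, not from the paper; the names `poisson_points` and `d_h`, the box geometry, and the parameter choices are illustrative assumptions.

```python
import heapq
import math
import random

def poisson_points(s, intensity=1.0, d=2, rng=random):
    # Homogeneous Poisson point process on the box [0, s]^d:
    # draw N ~ Poisson(intensity * s^d), then place N i.i.d. uniform points.
    lam = intensity * s ** d
    # Sample N by inversion of the Poisson CDF (adequate for moderate lam).
    n, p, target = 0, math.exp(-lam), rng.random()
    cum = p
    while cum < target:
        n += 1
        p *= lam / n
        cum += p
    return [tuple(rng.uniform(0, s) for _ in range(d)) for _ in range(n)]

def d_h(points, src, dst, h):
    # Euclidean first passage percolation distance d_h(src, dst):
    # shortest total Euclidean length of a path through the point cloud
    # whose individual hops all have length at most h. Returns math.inf
    # if src and dst lie in different connected components.
    pts = [src, dst] + list(points)
    n = len(pts)
    dist = [math.inf] * n
    dist[0] = 0.0
    pq = [(0.0, 0)]
    while pq:
        du, u = heapq.heappop(pq)
        if du > dist[u]:
            continue
        if u == 1:  # reached dst
            return du
        for v in range(n):
            if v == u:
                continue
            w = math.dist(pts[u], pts[v])
            if w <= h and du + w < dist[v]:
                dist[v] = du + w
                heapq.heappush(pq, (du + w, v))
    return math.inf

if __name__ == "__main__":
    rng = random.Random(0)
    s = 10.0
    pts = poisson_points(s, intensity=1.0, d=2, rng=rng)
    # d_h always dominates the straight-line distance s, consistent
    # with the constant sigma >= 1 in the limit d_{h_s}(0, s e_1)/s -> sigma.
    print(d_h(pts, (0.0, 0.0), (s, 0.0), h=2.0))
```

With $h$ larger than the straight-line separation, the direct hop is admissible and $d_h(0, se_1)=s$ exactly; constraining $h$ near the connectivity threshold forces longer detours, which is the regime the abstract's ratio convergence result addresses.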