Continuum Limit of Lipschitz Learning on Graphs
Foundations of Computational Mathematics (IF 3) | Pub Date: 2022-01-21 | DOI: 10.1007/s10208-022-09557-9
Tim Roith, Leon Bungert

Tackling semi-supervised learning problems with graph-based methods has become a trend in recent years, since graphs can represent all kinds of data and provide a suitable framework for studying continuum limits, for example, of differential operators. A popular strategy here is p-Laplacian learning, which poses a smoothness condition on the sought inference function on the set of unlabeled data. For \(p<\infty \), continuum limits of this approach were studied using tools from \(\varGamma \)-convergence. For the case \(p=\infty \), which is referred to as Lipschitz learning, continuum limits of the related infinity Laplacian equation were studied using the concept of viscosity solutions. In this work, we prove continuum limits of Lipschitz learning using \(\varGamma \)-convergence. In particular, we define a sequence of functionals which approximate the largest local Lipschitz constant of a graph function, and we prove \(\varGamma \)-convergence in the \(L^{\infty }\)-topology to the supremum norm of the gradient as the graph becomes denser. Furthermore, we show compactness of the functionals, which implies convergence of minimizers. In our analysis we allow the set of labeled data to vary, converging to a general closed set in the Hausdorff distance. We apply our results to nonlinear ground states, i.e., minimizers with constrained \(L^p\)-norm, and, as a by-product, prove convergence of graph distance functions to geodesic distance functions.
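The discrete-to-continuum statement can be sketched schematically as follows; this is an illustrative form only, and the precise graph weights, scaling, and topology used in the paper may differ:

```latex
% Schematic Γ-convergence statement (illustrative; not the paper's exact functional).
% Ω_n ⊂ Ω is the n-th point cloud, ε_n > 0 the graph length scale with ε_n → 0.
\[
  E_n(u) \;=\; \max_{\substack{x,y \in \Omega_n \\ 0 < |x-y| \le \varepsilon_n}}
    \frac{|u(x)-u(y)|}{|x-y|}
  \;\xrightarrow[\; n\to\infty \;]{\;\varGamma,\ L^{\infty}\;}\;
  E(u) \;=\; \| \nabla u \|_{L^{\infty}(\Omega)} .
\]
```

Here \(E_n\) is a discrete proxy for the largest local Lipschitz constant of the graph function \(u\), and the \(\varGamma\)-limit in the \(L^{\infty}\)-topology is the supremum norm of the gradient, matching the abstract's description; the admissible scaling of \(\varepsilon_n\) relative to the graph density is a key assumption of the analysis.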




Updated: 2022-01-23