In this paper we introduce Farthest Sampling Segmentation (FSS), a new method for the segmentation of triangulated surfaces, which consists of two fundamental steps: the computation of a submatrix $W^k$ of the affinity matrix $W$ and the application of the k-means clustering algorithm to the rows of $W^k$. The submatrix $W^k$ is obtained by computing the affinity between all triangles and only a few special triangles: those which are farthest apart in the defined metric. This is equivalent to selecting a sample of columns of $W$ without constructing the full matrix. The proposed method is computationally cheaper than other segmentation algorithms, since it only computes a few columns of $W$ and does not require the eigendecomposition of $W$ or of any submatrix of $W$. We prove that the orthogonal projection of $W$ onto the space generated by the columns of $W^k$ coincides with the orthogonal projection of $W$ onto the space generated by the $k$ eigenvectors computed by Nystr\"om's method using the columns of $W^k$ as a sample of $W$. Furthermore, we show that as the sample size $k$ increases, the proximity relationships among the rows of $W^k$ tend to faithfully reflect the proximity relationships among the corresponding rows of $W$. The FSS method does not depend on parameters that must be tuned by hand, and it is very flexible, since it can handle any metric to define the distance between triangles. Numerical experiments with several metrics and a large variety of 3D triangular meshes show that the segmentations obtained by computing less than 10% of the columns of $W$ are as good as those obtained by clustering the rows of the full matrix $W$.
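To make the two-step pipeline concrete, the following is a minimal Python sketch of the idea described above, not the authors' implementation: it assumes a user-supplied distance function `dist(i, j)` between triangles, a Gaussian affinity with bandwidth `sigma`, and scikit-learn's `KMeans`; all three are illustrative assumptions, since the method itself admits any metric and any affinity derived from it. The sketch picks the $k$ sample triangles by farthest-point sampling, forms the corresponding columns $W^k$, and clusters the rows of $W^k$.

```python
import numpy as np
from sklearn.cluster import KMeans

def farthest_sampling_segmentation(dist, n_triangles, k, n_segments, sigma=1.0, seed=0):
    """Sketch of FSS under assumed choices: dist(i, j) is a user-supplied
    distance between triangles i and j; affinity is Gaussian in that distance."""
    rng = np.random.default_rng(seed)

    # Farthest-point sampling: start from a random triangle and greedily add the
    # triangle farthest (in the chosen metric) from the current sample set.
    samples = [int(rng.integers(n_triangles))]
    min_d = np.array([dist(i, samples[0]) for i in range(n_triangles)])
    for _ in range(k - 1):
        nxt = int(np.argmax(min_d))
        samples.append(nxt)
        d_new = np.array([dist(i, nxt) for i in range(n_triangles)])
        min_d = np.minimum(min_d, d_new)

    # Columns of the affinity matrix restricted to the sampled triangles (W^k),
    # i.e. an n-by-k block of W computed without building W completely.
    D = np.array([[dist(i, j) for j in samples] for i in range(n_triangles)])
    Wk = np.exp(-D**2 / (2.0 * sigma**2))  # Gaussian affinity (one common choice)

    # Cluster the rows of W^k; each row is a k-dimensional signature of a triangle.
    labels = KMeans(n_clusters=n_segments, n_init=10, random_state=seed).fit_predict(Wk)
    return labels, samples
```

In this sketch only $n \cdot k$ distance evaluations are needed to form $W^k$, which is where the computational saving over building and eigendecomposing the full $n \times n$ matrix $W$ comes from.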