Multidimensional scaling (MDS) is the act of embedding proximity information about a set of $n$ objects in $d$-dimensional Euclidean space. As originally conceived by the psychometric community, MDS was concerned with embedding a fixed set of proximities associated with a fixed set of objects. Modern concerns, e.g., those that arise in developing asymptotic theories for statistical inference on random graphs, more typically involve studying the limiting behavior of a sequence of proximities associated with an increasing set of objects. Here we are concerned with embedding dissimilarities by minimizing Kruskal's (1964) raw stress criterion. Standard results from the theory of point-to-set maps can be used to establish that, if $n$ is fixed and a sequence of dissimilarity matrices converges, then the limit of their embedded structures is the embedded structure of the limiting dissimilarity matrix. But what if $n$ increases? It then becomes necessary to reformulate MDS so that the entire sequence of embedding problems can be viewed as a sequence of optimization problems in a fixed space. We present such a reformulation, {\em continuous MDS}. Within the continuous MDS framework, we derive two $L^p$ consistency results, one for embedding without constraints on the configuration, the other for embedding subject to {\em approximate Lipschitz constraints}\/ that encourage smoothness of the embedding function. The latter approach, {\em Approximate Lipschitz Embedding}\/ (ALE), is new. Finally, we demonstrate that embedded structures produced by ALE can be interpolated in a way that results in uniform convergence.
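For readers unfamiliar with the criterion, a minimal sketch in its standard form (the symbols $\delta_{ij}$, $x_i$, and $\sigma$ below are illustrative notation, not taken from the text): given observed dissimilarities $\delta_{ij}$ between objects $i$ and $j$, the raw stress of a configuration $X = (x_1, \ldots, x_n) \subset \mathbb{R}^d$ is
\[
  \sigma(X) \;=\; \sum_{i<j} \bigl( \lVert x_i - x_j \rVert - \delta_{ij} \bigr)^2 ,
\]
and an embedded structure is any configuration $X^\ast$ that minimizes $\sigma$.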