The weighted Euclidean norm $\|x\|_w$ of a vector $x\in \mathbb{R}^d$ with weights $w\in \mathbb{R}^d$ is the Euclidean norm where the contribution of each dimension is scaled by a given weight. Approaches to dimensionality reduction that satisfy the Johnson-Lindenstrauss (JL) lemma can easily be adapted to the weighted Euclidean distance if the weights are fixed: it suffices to scale each dimension of the input vectors according to the weights and then apply any standard approach. However, this is not the case when the weights are unknown at the time of the dimensionality reduction or may change dynamically. We address this issue with an approach that maps vectors into a smaller complex vector space while still satisfying a JL-like property for the weighted Euclidean distance once the weights are revealed. Specifically, let $\Delta\geq 1$ and $\epsilon \in (0,1)$ be arbitrary values, and let $S\subset \mathbb{R}^d$ be a set of $n$ vectors. We provide a weight-oblivious linear map $g:\mathbb{R}^d \rightarrow \mathbb{C}^k$, with $k=\Theta(\epsilon^{-2}\Delta^4 \ln{n})$, to reduce the vectors in $S$, and an estimator $\rho: \mathbb{C}^k \times \mathbb{R}^d \rightarrow \mathbb{R}$ with the following property. For any $x\in S$, the value $\rho(g(x), w)$ is an unbiased estimate of $\|x\|^2_w$, computed from the reduced vector $g(x)$ and the weights $w$. Moreover, the error of the estimate $\rho(g(x), w)$ depends on the norm distortion due to the weights and on the parameter $\Delta$: for any $x\in S$, the estimate has multiplicative error $\epsilon$ if $\|x\|_2\|w\|_2/\|x\|_w\leq \Delta$; otherwise, the estimate has an additive error of $\epsilon \|x\|^2_2\|w\|^2_2/\Delta^2$. Finally, we consider the estimation of weighted Euclidean norms in streaming settings: we show how to estimate the weighted norm when the weights are provided either after or concurrently with the input vector.
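To make the fixed-weights baseline mentioned above concrete, the following is a minimal sketch, assuming the weighted norm is defined as $\|x\|_w = \|w \odot x\|_2$ (elementwise product); the Gaussian JL map and all variable names are our illustrative choices, not the paper's construction $g$ or estimator $\rho$.

```python
import numpy as np

rng = np.random.default_rng(0)

def weighted_norm(x, w):
    """||x||_w under the assumed definition ||w * x||_2 (elementwise product)."""
    return np.linalg.norm(w * x)

def jl_map(d, k, rng):
    """A standard JL map: a k x d Gaussian matrix scaled by 1/sqrt(k)."""
    return rng.normal(size=(k, d)) / np.sqrt(k)

d, k, n = 1000, 200, 50
S = rng.normal(size=(n, d))         # input vectors
w = rng.uniform(0.5, 2.0, size=d)   # weights, assumed known in advance here

A = jl_map(d, k, rng)
reduced = (S * w) @ A.T             # scale each dimension by its weight, then project

# The Euclidean norm of each reduced vector approximates the weighted norm
# of the corresponding input vector.
errs = [abs(np.linalg.norm(y) - weighted_norm(x, w)) / weighted_norm(x, w)
        for x, y in zip(S, reduced)]
print(f"max relative error over {n} vectors: {max(errs):.3f}")
```

This baseline only works because the scaling by $w$ happens before the projection; it is exactly this step that becomes impossible when $w$ is revealed only after the reduction, which is the setting the paper's weight-oblivious map addresses.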