Weighted minwise hashing is a standard dimensionality reduction technique with applications to similarity search and large-scale kernel machines. We introduce a simple algorithm that takes a weighted set $x \in \mathbb{R}_{\geq 0}^{d}$ and computes $k$ independent minhashes in expected time $O(k \log k + \Vert x \Vert_{0}\log( \Vert x \Vert_1 + 1/\Vert x \Vert_1))$, improving upon the state-of-the-art BagMinHash algorithm (KDD '18) and representing the fastest weighted minhash algorithm for sparse data. Our experiments show running times that scale better with $k$ and $\Vert x \Vert_0$ compared to ICWS (ICDM '10) and BagMinHash, obtaining $10\times$ speedups in common use cases. Our approach also gives rise to a technique for computing fully independent locality-sensitive hash values for $(L, K)$-parameterized approximate near neighbor search under weighted Jaccard similarity in optimal expected time $O(LK + \Vert x \Vert_0)$, improving on prior work even in the case of unweighted sets.
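For context on what a weighted minhash computes, the following is a minimal sketch of the ICWS baseline cited above (Ioffe, ICDM '10), not the faster algorithm this paper introduces. It assumes a sparse input given as a dictionary mapping index to weight, and uses seeded NumPy generators to make the per-element randomness consistent across inputs; for two vectors hashed with the same seed, the collision probability of each hash equals their weighted Jaccard similarity.

```python
import numpy as np

def icws_minhash(x, k, seed=0):
    """Compute k ICWS weighted minhashes of a sparse non-negative
    vector x given as {index: weight}. Each hash is a pair (i, t);
    for two vectors hashed with the same seed,
    Pr[hashes collide] = weighted Jaccard similarity.
    Runs in O(k * nnz(x)) time, illustrating why faster schemes help."""
    hashes = []
    for h in range(k):
        best, best_log_a = None, np.inf
        for i, w in x.items():
            if w <= 0:
                continue
            # Consistent randomness: the same (seed, hash index, element)
            # triple always yields the same r, c, beta for any input vector.
            rng = np.random.default_rng([seed, h, i])
            r = rng.gamma(2.0, 1.0)      # r_i ~ Gamma(2, 1)
            c = rng.gamma(2.0, 1.0)      # c_i ~ Gamma(2, 1)
            beta = rng.uniform(0.0, 1.0) # beta_i ~ Uniform(0, 1)
            # ICWS quantization of the log-weight, and the score a_i
            # whose argmin over elements is the sampled element.
            t = np.floor(np.log(w) / r + beta)
            log_y = r * (t - beta)
            log_a = np.log(c) - log_y - r
            if log_a < best_log_a:
                best_log_a = log_a
                best = (i, int(t))
        hashes.append(best)
    return hashes
```

The fraction of agreeing hash pairs between two vectors then estimates $\sum_i \min(x_i, y_i) / \sum_i \max(x_i, y_i)$; the $O(k \cdot \Vert x \Vert_0)$ cost of this loop is the baseline the expected-time bound above improves on.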