We present an $O(1)$-round fully-scalable deterministic massively parallel algorithm for computing the min-plus matrix multiplication of unit-Monge matrices. We use this to derive an $O(\log n)$-round fully-scalable massively parallel algorithm for solving the exact longest increasing subsequence (LIS) problem. In the fully-scalable MPC regime, this substantially improves upon the previously known algorithm, which requires $O(\log^4 n)$ rounds, and matches the round complexity of the best known algorithm for computing a $(1+\epsilon)$-approximation of the LIS.
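For context, the min-plus (tropical) matrix product referred to above is the standard one; the display below restates it as a reminder, with the matrices $A$, $B$, $C$ and the notation $C = A \star B$ chosen here purely for illustration and not taken from the paper:
\[
  C_{i,j} \;=\; \min_{1 \le k \le n} \bigl( A_{i,k} + B_{k,j} \bigr),
  \qquad 1 \le i, j \le n, \quad \text{where } C = A \star B .
\]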