We consider the $k$-diameter clustering problem, where the goal is to partition a set of points in a metric space into $k$ clusters, minimizing the maximum distance between any two points in the same cluster. In general metrics, $k$-diameter is known to be NP-hard, while it admits a $2$-approximation algorithm (Gonzalez'85). Complementing this algorithm, it is known that $k$-diameter is NP-hard to approximate within a factor better than $2$ in the $\ell_1$ and $\ell_\infty$ metrics, and within a factor of $1.969$ in the $\ell_2$ metric (Feder-Greene'88). When $k\geq 3$ is fixed, $k$-diameter remains NP-hard to approximate within a factor better than $2$ in the $\ell_\infty$ metric (Megiddo'90). However, its approximability in this setting has not previously been studied in the $\ell_1$ and $\ell_2$ metrics, though a $1.415$-approximation algorithm in the $\ell_2$ metric follows from a known result (Badoiu et al.'02). In this paper, we address the remaining gap by showing new hardness-of-approximation results that hold even when $k=3$. Specifically, we prove that $3$-diameter is NP-hard to approximate within a factor better than $1.5$ in the $\ell_1$ metric, and within a factor of $1.304$ in the $\ell_2$ metric.
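The $2$-approximation cited above is Gonzalez's farthest-first traversal. For concreteness only (this is not part of the paper's contribution), a minimal Python sketch of that heuristic and of the $k$-diameter objective could look as follows; the function names, the choice of Euclidean ($\ell_2$) distance, and the toy input are illustrative assumptions rather than anything taken from the paper.

```python
import math


def dist(p, q):
    # Euclidean (l2) distance; any metric would do for the 2-approximation guarantee.
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))


def farthest_first_clusters(points, k):
    """Sketch of farthest-first traversal (Gonzalez'85).

    Greedily picks k centers, each time taking the point farthest from the
    centers chosen so far, then assigns every point to its nearest center.
    The resulting partition has maximum cluster diameter at most twice the
    optimal k-diameter value.
    """
    assert 1 <= k <= len(points)
    centers = [points[0]]  # arbitrary first center
    while len(centers) < k:
        # Next center: the point whose distance to its nearest center is largest.
        centers.append(max(points, key=lambda p: min(dist(p, c) for c in centers)))
    clusters = [[] for _ in range(k)]
    for p in points:
        idx = min(range(k), key=lambda i: dist(p, centers[i]))
        clusters[idx].append(p)
    return clusters


def max_cluster_diameter(clusters):
    """k-diameter objective: largest pairwise distance within any single cluster."""
    return max((dist(p, q) for cl in clusters for p in cl for q in cl), default=0.0)


if __name__ == "__main__":
    pts = [(0, 0), (0, 1), (10, 0), (10, 1), (5, 8), (6, 8)]
    clusters = farthest_first_clusters(pts, k=3)
    print(clusters)                              # three well-separated pairs
    print("max diameter:", max_cluster_diameter(clusters))  # 1.0 on this toy input
```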