We analyze the running time of Hartigan's method, an old algorithm for the $k$-means clustering problem. First, we construct an instance on the line on which the method can take $2^{\Omega(n)}$ steps to converge, demonstrating that Hartigan's method has exponential worst-case running time even when $k$-means is easy to solve. Since this contrasts with the algorithm's good performance in practice, we also analyze its running time in the framework of smoothed analysis. In particular, given an instance of $n$ points in $d$ dimensions, we prove that the expected number of iterations needed for Hartigan's method to terminate is bounded by $k^{12kd}\cdot \mathrm{poly}(n, k, d, 1/\sigma)$ when the points in the instance are perturbed by independent $d$-dimensional Gaussian random variables with mean $0$ and standard deviation $\sigma$.
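To make the step being analyzed concrete, the following is a minimal Python sketch of Hartigan's single-point reassignment rule: repeatedly move one point to another cluster whenever doing so strictly decreases the $k$-means objective (with cluster centroids updated accordingly), and stop at a local optimum where no single move helps. The function name, the random initialization, and the scan order are illustrative assumptions and not taken from the paper.

```python
# Sketch of Hartigan's single-point improvement heuristic for k-means,
# based on the standard description of the method; initialization and
# naming are illustrative assumptions, not the paper's code. Assumes n >= k.
import numpy as np

def hartigan_kmeans(points, k, rng=None, max_iter=100000):
    """Apply single-point moves until no move decreases the k-means cost."""
    rng = np.random.default_rng() if rng is None else rng
    n, d = points.shape
    # Start from an arbitrary assignment in which every cluster is non-empty.
    labels = np.concatenate([np.arange(k), rng.integers(0, k, n - k)])
    rng.shuffle(labels)
    sizes = np.bincount(labels, minlength=k).astype(float)
    sums = np.zeros((k, d))
    for i in range(n):
        sums[labels[i]] += points[i]

    for _ in range(max_iter):
        improved = False
        for i in range(n):
            a = labels[i]
            if sizes[a] <= 1:
                continue  # never empty a cluster in this sketch
            centers = sums / sizes[:, None]
            x = points[i]
            # Cost change of moving x from cluster a to cluster b:
            #   gain  = n_a/(n_a-1) * ||x - c_a||^2  (decrease from removing x)
            #   price = n_b/(n_b+1) * ||x - c_b||^2  (increase from inserting x)
            gain = sizes[a] / (sizes[a] - 1) * np.sum((x - centers[a]) ** 2)
            prices = sizes / (sizes + 1) * np.sum((x - centers) ** 2, axis=1)
            prices[a] = np.inf  # a point is never "moved" into its own cluster
            b = int(np.argmin(prices))
            if prices[b] < gain:  # strict improvement of the objective
                sums[a] -= x; sizes[a] -= 1
                sums[b] += x; sizes[b] += 1
                labels[i] = b
                improved = True
        if not improved:  # local optimum: no single reassignment improves the cost
            break
    return labels
```

Each accepted move strictly decreases the $k$-means objective, so the procedure terminates; the paper's question is how many such moves can be needed, in the worst case and after Gaussian perturbation.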