The notion of Las Vegas algorithm was introduced by Babai (1979) and may be defined in two ways:

* In Babai's original definition, a randomized algorithm is called Las Vegas if it has finitely bounded running time and certifiable random failure.
* Alternatively, in a widely accepted definition today, a Las Vegas algorithm is a zero-error randomized algorithm with random running time.

The equivalence between the two definitions is straightforward. In particular, by repeatedly running the algorithm until no failure is encountered, one can simulate the correct output of a successful run. We show that this can also be achieved for distributed local computation. Specifically, we show that in the LOCAL model, any Las Vegas algorithm that terminates in finite time with locally certifiable failures can be converted to a zero-error Las Vegas algorithm, at a polylogarithmic cost in the time complexity, such that the resulting algorithm perfectly simulates the output of the original algorithm on the same instance, conditioned on the original algorithm successfully returning without failure.
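To illustrate the classical repeat-until-success reduction in the centralized setting, here is a minimal Python sketch. The routine `run_once` is a hypothetical stand-in for a failure-certifiable Las Vegas algorithm (bounded running time, certified failure modeled by returning `None`); it is not from the paper, only an assumption for illustration.

```python
import random

# Hypothetical failure-certifiable Las Vegas routine: it always halts within a
# bounded number of steps and either returns a correct output or reports a
# certified failure (modeled here by returning None).
def run_once(instance, fail_prob=0.5):
    if random.random() < fail_prob:
        return None          # certified failure
    return sorted(instance)  # stand-in for a correct output

# Zero-error simulation: rerun with fresh randomness until no failure occurs.
# Conditioned on success, the output distribution of run_once is reproduced
# exactly; the price is that the running time becomes a random variable
# (a geometric number of attempts).
def zero_error_simulation(instance):
    while True:
        result = run_once(instance)
        if result is not None:
            return result

print(zero_error_simulation([3, 1, 2]))
```

The paper's contribution is the analogous conversion in the LOCAL model, where naive global restarting is unavailable and failures are only locally certifiable.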