A distance estimator for a graph property $\mathcal{P}$ is an algorithm that, given $G$ and $\alpha, \varepsilon > 0$, distinguishes between the case that $G$ is $(\alpha-\varepsilon)$-close to $\mathcal{P}$ and the case that $G$ is $\alpha$-far from $\mathcal{P}$ (in edit distance). We say that $\mathcal{P}$ is estimable if it has a distance estimator whose query complexity depends only on $\varepsilon$. Every estimable property is also testable, since testing corresponds to estimating with $\alpha = \varepsilon$. A central result in the area of property testing, the Fischer--Newman theorem, gives the converse statement: every testable property is in fact estimable. The proof of Fischer and Newman was highly ineffective, since it incurred a tower-type loss when transforming a testing algorithm for $\mathcal{P}$ into a distance estimator. This raised the natural problem, studied recently by Fiat--Ron and by Hoppen--Kohayakawa--Lang--Lefmann--Stagni, of whether one can find a transformation with a polynomial loss. We obtain the following results.
1. If $\mathcal{P}$ is hereditary, then one can turn a tester for $\mathcal{P}$ into a distance estimator with an exponential loss. This is an exponential improvement over the result of Hoppen et al., who obtained a transformation with a double exponential loss.
2. For every $\mathcal{P}$, one can turn a testing algorithm for $\mathcal{P}$ into a distance estimator with a double exponential loss. This improves over the transformation of Fischer--Newman, which incurred a tower-type loss.
Our main conceptual contribution in this work is that we manage to turn the approach of Fischer--Newman, which was inherently ineffective, into an efficient one. On the technical level, our main contribution is in establishing certain properties of Frieze--Kannan weak regular partitions that are of independent interest.
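For concreteness, here is a minimal formalization of the notions used above; the notation $\mathrm{dist}(G,\mathcal{P})$ and the normalization by $n^2$ are conventions chosen for illustration rather than fixed by the abstract. For an $n$-vertex graph $G$,
\[
\mathrm{dist}(G,\mathcal{P}) \;=\; \min_{\substack{G' \in \mathcal{P} \\ V(G') = V(G)}} \frac{\bigl|E(G) \,\triangle\, E(G')\bigr|}{n^2},
\]
so $G$ is $\delta$-close to $\mathcal{P}$ if $\mathrm{dist}(G,\mathcal{P}) \le \delta$ and $\delta$-far otherwise. A distance estimator must therefore accept when $\mathrm{dist}(G,\mathcal{P}) \le \alpha - \varepsilon$ and reject when $\mathrm{dist}(G,\mathcal{P}) \ge \alpha$; the case $\alpha = \varepsilon$ is exactly testing, i.e. accepting graphs satisfying $\mathcal{P}$ (distance $0$) and rejecting graphs that are $\varepsilon$-far from it.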