We introduce the following natural generalization of trace reconstruction, parameterized by a deletion probability $\delta \in (0,1)$ and a length $n$: There is a length-$n$ string of probabilities, $S=p_1,\ldots,p_n$, and each "trace" is obtained by (1) sampling a length-$n$ binary string whose $i$th coordinate is independently set to 1 with probability $p_i$ and to 0 otherwise, and then (2) deleting each of the binary values independently with probability $\delta$ and returning the resulting binary string of length $\le n$. The goal is to recover an estimate of $S$ from a set of independently drawn traces. In the case that all $p_i \in \{0,1\}$, this is the standard trace reconstruction problem. We show two complementary results. First, for worst-case strings $S$ and any deletion probability of order at least $1/\sqrt{n}$, no algorithm can approximate $S$ to within constant $\ell_\infty$ distance or to $\ell_1$ distance $o(\sqrt{n})$ using fewer than $2^{\Omega(\sqrt{n})}$ traces. Second -- as in the case of standard trace reconstruction -- reconstruction is easy for random $S$: for any sufficiently small constant deletion probability and any $\epsilon>0$, if each $p_i$ is drawn independently from the uniform distribution over $[0,1]$, then with high probability $S$ can be recovered to $\ell_1$ error $\epsilon$ using $\mathrm{poly}(n,1/\epsilon)$ traces and computation time. We show indistinguishability in our lower bound by regarding a complicated alternating sum (comparing two distributions) as the Fourier transform of a certain function evaluated at $\pm \pi$, and then showing that this Fourier transform decays rapidly away from zero by analyzing its moment generating function.
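To make the generative model concrete, the following is a minimal simulation sketch of the two-step trace-drawing process described above; it is illustrative only, and the function name `sample_trace`, the NumPy dependency, and the chosen parameters are not part of the paper.

```python
import numpy as np

def sample_trace(S, delta, rng=None):
    """Draw one trace from the probability string S under deletion probability delta.

    Step 1: sample a binary string x with x[i] = 1 independently with probability S[i].
    Step 2: delete each bit independently with probability delta and return the
            surviving bits in order, giving a binary string of length <= n.
    """
    rng = np.random.default_rng() if rng is None else rng
    S = np.asarray(S, dtype=float)
    x = (rng.random(S.shape) < S).astype(int)   # step 1: Bernoulli(p_i) bits
    keep = rng.random(S.shape) >= delta         # step 2: each bit survives w.p. 1 - delta
    return x[keep]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    S = rng.random(20)                          # random S: each p_i ~ Uniform[0,1]
    trace = sample_trace(S, delta=0.1, rng=rng)
    print(len(trace), trace)
```

Setting every entry of `S` to 0 or 1 recovers the standard trace reconstruction setting, where the only randomness in a trace comes from the deletions.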