The ``trace reconstruction'' problem asks: given an unknown binary string $x$ and a channel that repeatedly returns ``traces'' of $x$ in which each bit is independently deleted with some probability $p$, how many traces are needed to recover $x$? There is an exponential gap between the best known upper and lower bounds for this problem. Many variants of the model have been introduced in the hope of motivating or revealing new approaches that narrow this gap. We study the variant of circular trace reconstruction introduced by Narayanan and Ren (ITCS 2021), in which each trace undergoes a random cyclic shift in addition to random deletions. We show an improved lower bound of $\tilde{\Omega}(n^5)$ traces for circular trace reconstruction. This contrasts with the previously best known lower bounds of $\tilde{\Omega}(n^3)$ in the circular case and $\tilde{\Omega}(n^{3/2})$ in the linear case. Our bound is proved by showing that the traces of two sparse strings $x, y$, each with a constant number of nonzero bits, are hard to distinguish. Can this technique be extended significantly? How hard is it to reconstruct a sparse string $x$ under the circular deletion channel? We resolve these questions by showing, using Fourier techniques, that $\tilde{O}(n^6)$ traces suffice to reconstruct any constant-sparse string under the circular deletion channel, in contrast to the best known upper bound of $\exp(\tilde{O}(n^{1/3}))$ for general strings. Consequently, new algorithms or new lower bounds must focus on strings that are not constant-sparse.
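As a concrete illustration of the channel model described above, the following is a minimal Python sketch (not from the paper) that samples traces from a circular deletion channel: each bit is deleted independently with probability $p$, and the surviving bits are then given a uniformly random cyclic shift. The function name `circular_trace` and the choice to apply the shift after the deletions are illustrative assumptions, not the paper's formal definition.

```python
import random

def circular_trace(x, p, rng=random):
    """Sample one trace from a circular deletion channel (illustrative sketch):
    delete each bit of x independently with probability p, then apply a
    uniformly random cyclic shift to the surviving bits.  The order of the
    two operations here is one natural reading of the model in the abstract.
    """
    survivors = [b for b in x if rng.random() > p]  # keep each bit with prob. 1 - p
    if not survivors:
        return []                                   # every bit was deleted
    s = rng.randrange(len(survivors))               # uniformly random rotation
    return survivors[s:] + survivors[:s]

# Example: traces of a sparse string (a constant number of ones among zeros).
x = [0] * 20
x[3] = x[11] = x[17] = 1
traces = [circular_trace(x, p=0.3) for _ in range(5)]
print(traces)
```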