We study a hypothesis testing problem in the context of high-dimensional changepoint detection. Given a matrix $X \in \mathbb{R}^{p \times n}$ with independent Gaussian entries, the goal is to determine whether or not a sparse, non-null fraction of rows in $X$ exhibits a shift in mean at a common index between $1$ and $n$. We focus on three aspects of this problem: the sparsity of non-null rows, the presence of a single, common changepoint in the non-null rows, and the signal strength associated with the changepoint. Within an asymptotic regime relating the data dimensions $n$ and $p$ to the signal sparsity and strength, we characterize the information-theoretic limits of the testing problem by a formula that determines whether the sum of Type I and II errors tends to zero or is bounded away from zero. The formula, called the \emph{detection boundary}, is a curve that separates the parameter space into a detectable region and an undetectable region. We show that a Berk--Jones type test statistic can detect the presence of a sparse non-null fraction of rows, and does so adaptively throughout the detectable region. Conversely, within the undetectable region, no test is able to consistently distinguish the signal from noise.
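To make the testing setup concrete, the following Python sketch illustrates one way a Berk--Jones type scan over candidate changepoints could look. It is not the paper's procedure: it assumes known unit noise variance, forms two-sided normal p-values from row-wise CUSUM statistics at each candidate split, and the function names (`cusum_stats`, `berk_jones`, `changepoint_scan`) and all constants are hypothetical choices made only for illustration.

```python
import numpy as np
from scipy.stats import norm
from scipy.special import xlogy


def cusum_stats(X, t):
    """Standardized mean-shift statistic at split t for each row of X (p x n).

    Assumes unit-variance Gaussian noise (an assumption of this sketch):
    under the null, each entry of the returned vector is approximately N(0, 1).
    """
    p, n = X.shape
    left = X[:, :t].mean(axis=1)
    right = X[:, t:].mean(axis=1)
    scale = np.sqrt(t * (n - t) / n)  # Var(right - left) = n / (t (n - t))
    return scale * (right - left)


def berk_jones(pvals):
    """One-sided Berk--Jones type statistic on a vector of p-values.

    Computes max_k m * K(k/m, p_(k)), where K(a, b) is the Bernoulli
    Kullback--Leibler divergence, restricted to p_(k) < k/m.
    """
    m = len(pvals)
    b = np.clip(np.sort(pvals), 1e-12, 1 - 1e-12)
    a = np.arange(1, m + 1) / m
    K = xlogy(a, a / b) + xlogy(1.0 - a, (1.0 - a) / (1.0 - b))
    K = np.where(b < a, K, 0.0)  # one-sided: small p-values only
    return m * K.max()


def changepoint_scan(X):
    """Scan candidate changepoints and return the largest Berk--Jones value."""
    p, n = X.shape
    best = -np.inf
    for t in range(1, n):  # candidate split after column t
        z = cusum_stats(X, t)
        pvals = 2 * norm.sf(np.abs(z))  # two-sided p-values, one per row
        best = max(best, berk_jones(pvals))
    return best


if __name__ == "__main__":
    # Toy example with hypothetical sizes: a sparse set of s rows shifts
    # by delta at a common changepoint t0.
    rng = np.random.default_rng(0)
    p, n, t0, s, delta = 200, 100, 60, 10, 1.5
    X = rng.standard_normal((p, n))
    X[:s, t0:] += delta
    print("pure-noise scan value:", changepoint_scan(rng.standard_normal((p, n))))
    print("signal scan value:   ", changepoint_scan(X))
```

Calibration of the scan (e.g., by simulation under the null) and the precise form of the statistic achieving adaptivity over the detectable region are beyond this sketch and are developed in the paper itself.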