This paper investigates the asymptotics of the maximal throughput of communication over AWGN channels in $n$ channel uses under a covertness constraint, expressed as an upper bound $\delta$ on the Kullback-Leibler (KL) divergence. It is shown that the first- and second-order asymptotics of the maximal throughput are $\sqrt{n\delta \log e}$ and $\sqrt{2}\,(n\delta)^{1/4}(\log e)^{3/4}\cdot Q^{-1}(\epsilon)$, respectively. The achievability proof relies on the notion of the quasi-$\varepsilon$-neighborhood from information geometry. For finite blocklength $n$, the generating distributions are chosen to be a family of truncated Gaussian distributions with decreasing variances. The rate of decrease is carefully designed to maximize the throughput over the main channel in the asymptotic sense, subject to the condition that the output distributions satisfy the covertness constraint. For the converse, the optimality of the Gaussian distribution for minimizing the KL divergence under a second-moment constraint is extended from dimension $1$ to dimension $n$. Based on this result, we establish an upper bound on the average power of any code satisfying the covertness constraint, which in turn leads to a direct converse bound in terms of the covertness metric.
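To make the stated asymptotics concrete, the sketch below evaluates the two terms numerically. This is an illustrative aid only, not part of the paper: the function name is hypothetical, and $Q^{-1}$ is computed via the standard-normal inverse CDF from the Python standard library ($Q^{-1}(\epsilon) = \Phi^{-1}(1-\epsilon)$).

```python
from math import e, log2, sqrt
from statistics import NormalDist

def covert_throughput_terms(n, delta, eps):
    """Hypothetical helper: first- and second-order terms of the maximal
    covert throughput over an AWGN channel, as stated in the abstract.

    n     -- blocklength (number of channel uses)
    delta -- upper bound on the KL divergence (covertness level)
    eps   -- target error probability, 0 < eps < 1
    """
    log_e = log2(e)                          # log e, taking logs base 2 (bits)
    q_inv = NormalDist().inv_cdf(1 - eps)    # Q^{-1}(eps) = Phi^{-1}(1 - eps)
    first = sqrt(n * delta * log_e)                            # sqrt(n*delta*log e)
    second = sqrt(2) * (n * delta) ** 0.25 * log_e ** 0.75 * q_inv
    return first, second

# Example: the first-order term grows like sqrt(n), the second like n^(1/4).
f1, s1 = covert_throughput_terms(1000, 0.01, 0.05)
f4, s4 = covert_throughput_terms(4000, 0.01, 0.05)
```

Note the square-root law typical of covert communication: quadrupling the blocklength only doubles the first-order term, while the second-order term scales by $\sqrt{2}$.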