This paper investigates the asymptotics of the maximal throughput of communication over AWGN channels in $n$ channel uses under a covert constraint, expressed as an upper bound $\delta$ on the Kullback-Leibler (KL) divergence. It is shown that the first- and second-order asymptotics of the maximal throughput are $\sqrt{n\delta \log e}$ and $\sqrt{2}\,(n\delta)^{1/4}(\log e)^{3/4}\, Q^{-1}(\epsilon)$, respectively. The achievability proof relies on the notion of the quasi-$\varepsilon$-neighborhood from information geometry. We prove that if the distribution generating the codebook is close to a Dirac measure in the weak sense, then the corresponding output distribution at the adversary satisfies the covert constraint with respect to most common divergences. This links the local differential geometry of the noise distribution to the covert constraint. For the converse, the optimality of the Gaussian distribution for minimizing the KL divergence under a second-moment constraint is extended from dimension $1$ to dimension $n$. This establishes an upper bound on the average power of any code satisfying the covert constraint, which in turn yields a direct converse bound in terms of the covert metric.
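To make the claimed expansion concrete, the two asymptotic terms can be combined into a single normal-approximation form. This is only a sketch: the notation $M^*(n,\epsilon,\delta)$ for the maximal throughput at average error probability $\epsilon$, the minus sign on the $Q^{-1}(\epsilon)$ term (the standard convention, with $Q^{-1}(\epsilon)>0$ for $\epsilon<1/2$), and the order of the remainder are assumptions not stated in the abstract:
$$\log M^*(n,\epsilon,\delta) \;=\; \sqrt{n\delta \log e} \;-\; \sqrt{2}\,(n\delta)^{1/4}(\log e)^{3/4}\, Q^{-1}(\epsilon) \;+\; o\!\left((n\delta)^{1/4}\right),$$
where $Q^{-1}$ is the inverse of the Gaussian tail function $Q(x)=\int_x^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{-t^2/2}\, dt$. Note that, unlike the classical finite-blocklength expansion $nC - \sqrt{nV}\,Q^{-1}(\epsilon) + O(\log n)$, both leading terms here grow sublinearly in $n$, reflecting the square-root law of covert communication.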