For any finite discrete source, the competitive advantage of prefix code $C_1$ over prefix code $C_2$ is the probability that $C_1$ produces a shorter codeword than $C_2$, minus the probability that $C_2$ produces a shorter codeword than $C_1$. For any source, a prefix code is competitively optimal if it has a nonnegative competitive advantage over every other prefix code. In 1991, Cover proved that Huffman codes are competitively optimal for all dyadic sources. We prove the following asymptotic converse: as the source size grows, the probability that a Huffman code for a randomly chosen non-dyadic source is competitively optimal converges to zero. We also prove: (i) For any source, competitively optimal codes cannot exist unless a Huffman code is competitively optimal; (ii) For any non-dyadic source, a Huffman code has a positive competitive advantage over a Shannon-Fano code; (iii) For any source, the competitive advantage of any prefix code over a Huffman code is strictly less than $\frac{1}{3}$; (iv) For each integer $n>3$, there exist sources of size $n$ and prefix codes whose competitive advantages over Huffman codes come arbitrarily close to $\frac{1}{3}$; and (v) For each positive integer $n$, there exist a source of size $n$ and a prefix code whose competitive advantage over a Shannon-Fano code approaches $1$ as $n\longrightarrow\infty$.
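The competitive advantage defined above depends only on the codeword lengths of the two codes, so it is straightforward to compute. The following sketch (not from the paper; the function name and the example codes are illustrative choices) evaluates it for a dyadic source, comparing the Huffman lengths $-\log_2 p_i$ against a fixed-length alternative:

```python
def competitive_advantage(p, lens1, lens2):
    """Competitive advantage of code C1 over C2 for distribution p:
    P(C1 codeword shorter than C2's) - P(C2 codeword shorter than C1's).
    Only the codeword lengths matter, not the codewords themselves."""
    adv = 0.0
    for pi, l1, l2 in zip(p, lens1, lens2):
        if l1 < l2:
            adv += pi   # C1 wins on this symbol
        elif l2 < l1:
            adv -= pi   # C2 wins on this symbol
    return adv

# Dyadic source: every probability is a power of 1/2.
p = [1/2, 1/4, 1/8, 1/8]
huffman_lens = [1, 2, 3, 3]   # Huffman lengths equal -log2(p_i) here
fixed_lens   = [2, 2, 2, 2]   # a fixed-length competitor

print(competitive_advantage(p, huffman_lens, fixed_lens))  # 0.25
```

The result is nonnegative, consistent with Cover's theorem that Huffman codes are competitively optimal for dyadic sources.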