
Essay: The Zero Error Capacity of a Noisy Channel

Focus and novelty
Claude Shannon isolates the problem of transmitting information with absolutely no possibility of error over a noisy discrete memoryless channel. Unlike ordinary capacity, which tolerates vanishing error probability and depends on transition probabilities, the zero-error question depends only on which input-output pairs can occur at all. The core insight is to recast zero-error communication as a combinatorial problem on graphs, turning coding questions into questions about independence, coloring, and products of graphs.

Confusability and the graph model
A channel is represented by a bipartite graph linking inputs to outputs whenever the transition probability is nonzero. From this, Shannon constructs the confusability graph on the input alphabet: two inputs are adjacent if there exists at least one output that both can produce. A zero-error code is then a set of inputs with no confusable pairs, i.e., an independent set in this graph. For block length n, confusability is determined componentwise, leading to the strong graph product G^n of the single-letter confusability graph G. The largest zero-error code of length n has size M(n) = α(G^n), where α denotes the independence number.
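This construction is easy to make concrete. A minimal Python sketch (the three-input channel and its output sets are invented for illustration, and the brute-force search is only meant for tiny graphs):

```python
from itertools import combinations, product

def confusability_graph(possible):
    """Confusability graph as an edge set: inputs x, y are adjacent
    when some output is reachable (nonzero probability) from both."""
    inputs = sorted(possible)
    return {(x, y) for x, y in combinations(inputs, 2)
            if possible[x] & possible[y]}

def strong_power(vertices, edges, n):
    """n-fold strong product G^n: two input blocks are confusable iff
    their symbols are confusable (equal or adjacent) in every coordinate."""
    def conf(a, b):
        return a == b or (a, b) in edges or (b, a) in edges
    verts = list(product(vertices, repeat=n))
    E = {(u, v) for u, v in combinations(verts, 2)
         if all(conf(a, b) for a, b in zip(u, v))}
    return verts, E

def alpha(verts, edges):
    """Independence number by brute force: the size of the largest
    set of pairwise non-adjacent vertices, i.e. a zero-error code."""
    for k in range(len(verts), 0, -1):
        for S in combinations(verts, k):
            if not any((u, v) in edges or (v, u) in edges
                       for u, v in combinations(S, 2)):
                return k
    return 0

# Hypothetical toy channel: input -> set of outputs with nonzero probability.
channel = {0: {'a', 'b'}, 1: {'b', 'c'}, 2: {'c', 'd'}}
E = confusability_graph(channel)   # inputs 0~1 and 1~2 share an output
print(alpha([0, 1, 2], E))         # largest zero-error code of length 1: {0, 2}
```

Here M(1) = α(G) = 2, since inputs 0 and 2 share no output and so can never be confused.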

Defining zero-error capacity
Zero-error capacity C0 is the exponential growth rate of the largest zero-error code size, C0 = lim_{n→∞} (1/n) log M(n). Using the supermultiplicativity α(G^{m+n}) ≥ α(G^m) α(G^n), the limit exists and equals sup_n (1/n) log α(G^n). Equivalently, with c(G) = sup_n α(G^n)^{1/n}, one has C0 = log c(G). This separates sharply from ordinary capacity: two channels with identical confusability graphs share the same C0 even if their probabilities differ, while ordinary capacity can vary with probabilities.
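The existence of the limit deserves one more line of justification; a sketch in LaTeX, applying Fekete's lemma to the superadditive sequence log α(G^n):

```latex
% Concatenating a zero-error code of length m with one of length n
% yields a zero-error code of length m+n, so
\alpha(G^{m+n}) \;\ge\; \alpha(G^m)\,\alpha(G^n).
% Hence a_n := \log \alpha(G^n) is superadditive, and Fekete's lemma gives
C_0 \;=\; \lim_{n\to\infty} \frac{a_n}{n}
    \;=\; \sup_n \frac{1}{n}\log \alpha(G^n)
    \;=\; \log c(G),
\qquad c(G) := \sup_n \alpha(G^n)^{1/n}.
```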

Bounds and illustrative channels
Shannon derives upper and lower bounds on C0 through combinatorial parameters of G and its powers, using independent sets for achievability and clique covers or colorings for converse bounds. Extremal cases sharpen intuition. A noiseless channel has G with no edges, so α(G^n) = |X|^n and C0 equals log |X|. By contrast, if every pair of inputs can yield a common output (as in a binary symmetric or binary erasure channel), G is complete and α(G^n) = 1, giving C0 = 0. More subtle behavior appears in “typewriter” channels where each symbol can be mistaken only for its neighbors in a cycle. For the 5-cycle, Shannon exhibits a code showing α(G^2) = 5, which yields the lower bound (1/2) log2 5 ≈ 1.16 bits per use — strictly above the 1 bit per use achievable with single-letter codes, yet strictly below the noiseless log2 5. At the time the exact value of C0 for the pentagon was unknown, highlighting the problem’s combinatorial depth.
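The pentagon code can be checked mechanically. A sketch in Python (the five codewords below are one standard choice achieving α(G^2) = 5; any cyclic relabeling works the same way):

```python
from itertools import combinations

# The 5-cycle C5: inputs 0..4, each confusable with its two cyclic neighbors.
def confusable(a, b):
    return a == b or (a - b) % 5 in (1, 4)

# A length-2 zero-error code for the pentagon channel.
code = [(0, 0), (1, 2), (2, 4), (3, 1), (4, 3)]

# Zero-error check: every pair of codewords must be non-confusable in at
# least one coordinate, i.e. the code is independent in the strong
# product C5 x C5.
ok = all(not all(confusable(a, b) for a, b in zip(u, v))
         for u, v in combinations(code, 2))
print(ok, len(code))   # True 5 -> rate (1/2) log2 5 ≈ 1.16 bits per use
```

Each pair of codewords differs by a cyclic distance of 2 in at least one coordinate, which is exactly what makes them unconfusable over two channel uses.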

Feedback and variable-length coding
Allowing noiseless feedback does not change ordinary capacity but can increase zero-error capacity. Shannon formulates zero-error communication with feedback as an adaptive decision process that prunes the set of possible inputs consistent with the observed outputs. He provides general bounds and examples where feedback strictly raises the zero-error rate, including cases where C0 without feedback is zero yet becomes positive with feedback. Variable-length strategies with decision trees can be essential to achieve the best zero-error rates under feedback.

Significance
The paper inaugurates the graph-theoretic study of information transmission without errors, introducing the confusability graph, graph products, and the graph capacity now called the Shannon capacity. It shows that zero-error communication hinges on the combinatorial structure of which confusions are possible, not their likelihood. The examples and bounds lay out a program that later work would deepen, while already demonstrating that even tiny channels can lead to challenging and surprising capacity phenomena.

Introduces the concept of zero-error capacity for communication channels and uses graph-theoretic methods to analyze channel behavior when no errors are permitted, expanding the mathematical scope of information theory.


Author: Claude Shannon

Claude Shannon, the father of information theory whose innovations laid the foundation for today's digital age.