noisy-channel coding theorem
E624507
The noisy-channel coding theorem is a fundamental result in information theory that establishes the maximum rate at which information can be transmitted over a noisy communication channel with arbitrarily low error probability using appropriate encoding schemes.
Statements (45)
| Predicate | Object |
|---|---|
| instanceOf | result in information theory; theorem |
| alsoKnownAs | Shannon coding theorem; channel coding theorem |
| appliesTo | additive white Gaussian noise channels; discrete memoryless channels |
| assumes | discrete memoryless channel; probabilistic channel model; sufficiently long block length |
| characterizes | maximum achievable reliable communication rate |
| defines | channel capacity |
| doesNotSpecify | explicit construction of optimal codes |
| field | information theory |
| formulatedBy | Claude E. Shannon |
| foundationFor | modern digital communication theory |
| hasConverse | strong converse for channel coding; weak converse for channel coding |
| hasVariant | Gaussian channel coding theorem; continuous-time version |
| historicalSignificance | cornerstone of Shannon's information theory |
| implies | existence of long block codes with low error probability; trade-off between rate and reliability |
| influencedField | coding theory; data compression theory; digital communications; network information theory |
| inspired | development of coding theory; development of error-correcting codes |
| involvesQuantity | channel transition probabilities; input distribution; mutual information between input and output |
| mathematicallyExpresses | channel capacity as maximum mutual information over input distributions |
| publishedIn | A Mathematical Theory of Communication |
| relatesConcept | coding rate; entropy; error probability; mutual information |
| statesThat | for any rate below channel capacity there exist codes with arbitrarily small error probability; reliable communication over a noisy channel is possible if and only if the transmission rate is less than channel capacity |
| usedIn | analysis of data transmission limits; design of communication systems |
| usesConcept | asymptotic equipartition property; random coding argument; typical sequences |
| yearProposed | 1948 |
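The mathematicallyExpresses statement above can be made concrete for the simplest discrete memoryless channel. The sketch below, using the binary symmetric channel (BSC) as an illustrative example not itself listed in this entity's statements, computes mutual information I(X; Y) for a range of input distributions and checks that its maximum matches the closed-form capacity C = 1 − H₂(p), where H₂ is the binary entropy function.

```python
import math


def binary_entropy(p: float) -> float:
    """Binary entropy H2(p) in bits, with H2(0) = H2(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)


def mutual_information(q: float, p: float) -> float:
    """I(X; Y) for a BSC with crossover probability p and input P(X=1) = q.

    For the BSC, I(X; Y) = H(Y) - H(Y|X) = H2(q(1-p) + (1-q)p) - H2(p).
    """
    y1 = q * (1 - p) + (1 - q) * p  # P(Y = 1)
    return binary_entropy(y1) - binary_entropy(p)


def bsc_capacity(p: float) -> float:
    """Closed-form BSC capacity: C = 1 - H2(p), achieved by the uniform input."""
    return 1.0 - binary_entropy(p)


# Maximizing I(X; Y) over a grid of input distributions recovers the capacity,
# matching "channel capacity as maximum mutual information over input distributions".
p = 0.11
best = max(mutual_information(i / 1000, p) for i in range(1001))
print(best, bsc_capacity(p))
```

At p = 0.5 the output is statistically independent of the input and the capacity is zero; at p = 0 the channel is noiseless and the capacity is one bit per use.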
Referenced by (1)