source coding theorem

E624506

The source coding theorem is a fundamental result in information theory that establishes the minimum average number of bits needed to losslessly encode symbols from a given information source, linking this limit to the source’s entropy.
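The entropy limit described above can be made concrete with a minimal sketch: computing the Shannon entropy H(X) of a hypothetical four-symbol source (the distribution below is an illustrative assumption, not from this entity's statements), which gives the minimum average bits per symbol any lossless code can achieve.

```python
import math

def entropy(probs):
    """Shannon entropy H(X) in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical skewed four-symbol source.
probs = [0.5, 0.25, 0.125, 0.125]
H = entropy(probs)
print(H)  # 1.75 — no lossless code can average fewer bits per symbol
```

For this dyadic distribution the bound is met exactly by a prefix code with codeword lengths 1, 2, 3, 3.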


Statements (44)

Predicate Object
instanceOf theorem in information theory
alsoKnownAs Shannon source coding theorem
noiseless coding theorem
appliesTo discrete memoryless sources
stationary ergodic sources
assumes large block length coding
probabilistic model of the source
category fundamental theorem of information theory
concerns noiseless communication
contrastsWith channel coding theorem
field information theory
formulatedBy Claude E. Shannon
goal minimize average number of bits per symbol for lossless representation
guarantees existence of asymptotically optimal codes
hasApplication coding for storage systems
entropy coding in multimedia standards
file compression
hasConsequence Huffman coding is optimal among prefix codes for a given source
no lossless code can have average rate below the source entropy
universal codes aim to approach the entropy without full source knowledge
implies entropy is the fundamental limit of lossless compression
influences design of practical compression algorithms
entropy coding methods
rate–distortion theory
isPartOf Shannon’s information theory
mathematicalForm L̄ ≥ H(X) for any uniquely decodable code
for every ε > 0 there exists a code with L̄ < H(X) + ε
publishedIn A Mathematical Theory of Communication
relatesConcept average codeword length
data compression
entropy
lossless compression
optimal coding
prefix codes
uniquely decodable codes
requires prefix-free or instantaneous codes for practical realization
statesThat for any uniquely decodable code the average codeword length is at least the source entropy
the minimum achievable average codeword length per source symbol is lower bounded by the entropy of the source
there exist codes whose average codeword length is arbitrarily close to the source entropy
upperBoundGivenBy H(X)+1 for optimal prefix codes of a discrete memoryless source
usesConcept Kraft–McMillan inequality
law of large numbers
typical sequences
yearProposed 1948
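The statements above (mathematicalForm, usesConcept Kraft–McMillan inequality, upperBoundGivenBy) can be checked together in a short sketch, assuming the same illustrative dyadic source distribution and a standard prefix code for it (0, 10, 110, 111); these inputs are examples, not part of this entity's data.

```python
import math

def entropy(probs):
    """Shannon entropy H(X) in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def kraft_sum(lengths):
    """Kraft–McMillan sum; <= 1 for any uniquely decodable code."""
    return sum(2.0 ** -l for l in lengths)

probs = [0.5, 0.25, 0.125, 0.125]
lengths = [1, 2, 3, 3]  # prefix code 0, 10, 110, 111

avg_len = sum(p * l for p, l in zip(probs, lengths))
H = entropy(probs)

assert kraft_sum(lengths) <= 1      # code is realizable as a prefix code
assert H <= avg_len < H + 1         # L̄ ≥ H(X), and within H(X)+1 for optimal prefix codes
```

For a dyadic source the inequalities collapse: the average length 1.75 equals H(X) exactly, illustrating the "arbitrarily close to the source entropy" statement.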

Referenced by (1)

Full triples — surface form annotated when it differs from this entity's canonical label.

information theory hasCoreConcept source coding theorem