Slepian–Wolf coding theorem

E641830

The Slepian–Wolf coding theorem is a fundamental result in information theory that characterizes the limits of lossless data compression for correlated sources encoded separately but decoded jointly.


Observed surface forms (1)

Surface form Occurrences
Slepian–Wolf bound 1

Statements (49)

Predicate Object
instanceOf coding theorem
information theory theorem
appliesTo discrete memoryless sources
stationary ergodic sources (with extensions)
assumes joint decoder
known joint distribution of sources at encoder and decoder (in classical formulation)
separate encoders
two or more correlated discrete memoryless sources
characterizes rate region for lossless compression of correlated sources
concerns correlated information sources
distributed source coding
lossless data compression
separate encoding and joint decoding
field information theory
generalizationOf lossless source coding to distributed encoders
guarantees arbitrarily small probability of decoding error for rates in achievable region
hasApplicationIn Wyner–Ziv coding
compressing correlated data streams
distributed sensor networks
multiterminal source coding
network information theory
hasCodingApproach LDPC code based Slepian–Wolf coding
syndrome-based coding using linear channel codes
turbo code based Slepian–Wolf coding
implies correlation can be exploited at the decoder
no rate loss compared to joint encoding for lossless compression
influenced correlation-aware compression algorithms
development of distributed video coding
inspired practical Slepian–Wolf codes based on channel codes
introducedIn 1973
involves asymptotically long block lengths
isSpecialCaseOf multiterminal source coding theory
namedAfter David Slepian
Jack Wolf
publishedIn IEEE Transactions on Information Theory
rateConstraint R_X + R_Y ≥ H(X,Y)
R_X ≥ H(X|Y)
R_Y ≥ H(Y|X)
relatedTo Shannon source coding theorem
Wyner–Ziv theorem
network coding
shows side information at the decoder is sufficient for optimal compression rates
states each individual rate must be at least conditional entropy given the other source
sum of individual rates must be at least joint entropy of sources
usesConcept conditional entropy
entropy
joint entropy
joint typicality decoding
typical sequences
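The rateConstraint statements above define the Slepian–Wolf achievable region: R_X ≥ H(X|Y), R_Y ≥ H(Y|X), and R_X + R_Y ≥ H(X,Y). A minimal sketch of checking these constraints numerically, assuming a hypothetical correlation model where X ~ Bernoulli(0.5) and Y = X XOR N with N ~ Bernoulli(0.1) (all function names here are illustrative, not from any library):

```python
import math

def H(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Assumed correlation model: X ~ Bernoulli(0.5), Y = X XOR N, N ~ Bernoulli(0.1)
p = 0.1
H_X_given_Y = H(p)            # conditional entropy H(X|Y)
H_Y_given_X = H(p)            # by symmetry of the model
H_XY = 1.0 + H(p)             # chain rule: H(X,Y) = H(X) + H(Y|X)

def in_rate_region(Rx, Ry):
    """All three Slepian-Wolf constraints must hold for achievability."""
    return (Rx >= H_X_given_Y and
            Ry >= H_Y_given_X and
            Rx + Ry >= H_XY)

print(in_rate_region(1.0, 1.0))    # both sources at full rate -> True
print(in_rate_region(H(p), 1.0))   # corner point (H(X|Y), H(Y)) -> True
print(in_rate_region(0.3, 0.3))    # sum below H(X,Y) ~ 1.469 -> False
```

The second call illustrates the "no rate loss" statement: one encoder can compress all the way down to H(X|Y) ≈ 0.469 bits as long as the other sends at its full entropy, so the sum rate matches joint encoding.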

Referenced by (2)

Full triples — surface form annotated when it differs from this entity's canonical label.

David Slepian notableWork Slepian–Wolf coding theorem
David Slepian notableConcept Slepian–Wolf coding theorem
this entity surface form: Slepian–Wolf bound
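The hasCodingApproach statement "syndrome-based coding using linear channel codes" can be sketched with a (7,4) Hamming code: the encoder transmits only the syndrome of its block, and the decoder combines it with correlated side information to recover the source. This is a minimal illustration under the assumption that the two sources differ in at most one bit per 7-bit block; the matrix and function names are hypothetical, not from any standard API:

```python
import numpy as np

# Parity-check matrix of the (7,4) Hamming code; column i is the
# binary expansion of i (least significant bit in row 0).
Hm = np.array([[1, 0, 1, 0, 1, 0, 1],
               [0, 1, 1, 0, 0, 1, 1],
               [0, 0, 0, 1, 1, 1, 1]], dtype=int)

def syndrome(v):
    return Hm @ v % 2

def sw_encode(x):
    """Encoder: transmit only the 3-bit syndrome of the 7-bit source block."""
    return syndrome(x)

def sw_decode(s, y):
    """Decoder: recover x from its syndrome s and side information y,
    assuming x and y differ in at most one position."""
    # By linearity, syndrome(y) XOR s equals the syndrome of e = x XOR y.
    e_syn = (syndrome(y) + s) % 2
    e = np.zeros(7, dtype=int)
    if e_syn.any():
        pos = e_syn[0] + 2 * e_syn[1] + 4 * e_syn[2]  # 1-indexed error position
        e[pos - 1] = 1
    return (y + e) % 2

x = np.array([1, 0, 1, 1, 0, 0, 1])   # source block at the encoder
y = x.copy(); y[4] ^= 1               # correlated side information at the decoder
s = sw_encode(x)                      # 3 bits sent instead of 7
print((sw_decode(s, y) == x).all())   # -> True
```

The encoder never sees y, yet the decoder reconstructs x exactly from 3 transmitted bits, which is the separate-encoding/joint-decoding pattern the theorem guarantees is possible at rates down to the conditional entropy.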