Shannon entropy
E1168
Shannon entropy is a fundamental measure in information theory that quantifies the average uncertainty, or equivalently the average information content, of a random variable or message source.
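For concreteness, a minimal Python sketch of this definition follows; the helper name `shannon_entropy` and the example distributions are illustrative choices, not part of this entry.

```python
import math

def shannon_entropy(probs, base=2):
    """Shannon entropy H(X) = -sum_x p(x) log p(x) of a discrete
    distribution, in units set by the logarithm base (2 gives bits)."""
    # Zero-probability outcomes contribute nothing: p * log(p) -> 0 as p -> 0.
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin is maximally uncertain for two outcomes: exactly 1 bit.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so its entropy is lower.
print(shannon_entropy([0.9, 0.1]))   # ~0.4690
```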
Aliases (3)
- Shannon additivity axiom ×1
- Shannon information ×1
- Shannon limit ×1
Statements (50)
| Predicate | Object |
|---|---|
| instanceOf | entropy measure; information theory concept; random variable functional; uncertainty measure |
| appliesTo | discrete probability distributions; discrete random variables |
| captures | expected codeword length lower bound in lossless compression |
| dependsOn | probability distribution of a random variable |
| field | information theory |
| generalizes | Hartley entropy |
| hasFormula | $H(X) = -\sum_x p(x) \log p(x)$ (checked numerically in the sketch after this table) |
| introducedBy | Claude Shannon |
| introducedInWork | A Mathematical Theory of Communication |
| introducedInYear | 1948 |
| invariantUnder | relabeling of outcomes |
| isAdditiveFor | independent random variables |
| isConcaveIn | probability distribution |
| isMaximumWhen | distribution is uniform |
| isMinimumWhen | distribution is degenerate |
| isNonNegative | true |
| isSpecialCaseOf | Rényi entropy; Tsallis entropy |
| logarithmBaseDetermines | unit of information |
| measuredIn | bits; hartleys; nats |
| minimumValue | 0 |
| namedAfter | Claude Shannon |
| quantifies | average information content of a random variable; average uncertainty of a random variable |
| relatedConcept | Kullback–Leibler divergence; conditional entropy; differential entropy; mutual information; relative entropy |
| satisfies | Shannon–Khinchin axioms; chain rule for entropy |
| symbol | H |
| usedIn | bioinformatics; channel coding theory; cryptography; data compression theory; ecology diversity indices; machine learning; neuroscience; signal processing; statistical mechanics; thermodynamics analogies |
| usedToDefine | Shannon capacity of a channel; entropy rate of a stochastic process |
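Several of the statements above can be verified numerically. A minimal sketch, reusing the illustrative `shannon_entropy` helper defined earlier, checks the maximum at the uniform distribution, the minimum of 0 at a degenerate distribution, additivity for independent variables, and how the logarithm base determines the unit.

```python
import itertools
import math

def shannon_entropy(probs, base=2):
    # Same illustrative helper as above.
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# isMaximumWhen: the uniform distribution over n outcomes attains log2(n) bits.
n = 8
assert abs(shannon_entropy([1 / n] * n) - math.log2(n)) < 1e-12

# isMinimumWhen / minimumValue: a degenerate (one-point) distribution has entropy 0.
assert abs(shannon_entropy([1.0, 0.0, 0.0])) < 1e-12

# isAdditiveFor: H(X, Y) = H(X) + H(Y) when X and Y are independent,
# since the joint probabilities factor as p(x) * q(y).
px, py = [0.3, 0.7], [0.25, 0.25, 0.5]
joint = [p * q for p, q in itertools.product(px, py)]
assert abs(shannon_entropy(joint) - (shannon_entropy(px) + shannon_entropy(py))) < 1e-12

# logarithmBaseDetermines / measuredIn: base 2 gives bits, base e nats, base 10 hartleys.
print(shannon_entropy(px, base=2))        # bits
print(shannon_entropy(px, base=math.e))   # nats
print(shannon_entropy(px, base=10))       # hartleys
```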
Referenced by (11)
| Subject (surface form when different) | Predicate |
|---|---|
| Boltzmann–Gibbs entropy; Kullback–Leibler divergence; Shannon–Khinchin axioms | relatedTo |
| Faddeev’s axioms; Shannon–Khinchin axioms | characterizes |
| A Mathematical Theory of Communication ("Shannon limit") | associatedWithConcept |
| Rényi entropy | generalizes |
| Shannon–Khinchin axioms ("Shannon additivity axiom") | hasAxiom |
| Claude Shannon | knownFor |
| Claude Shannon | notableConcept |
| Maxwell's demon thought experiment ("Shannon information") | relatedConcept |