Kullback–Leibler divergence
E6392
Kullback–Leibler divergence is a fundamental information-theoretic measure that quantifies how one probability distribution differs from a reference distribution.
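For reference, a standard formulation (illustrative notation; not one of the extracted statements below): for distributions P and Q on the same space, with the convention 0 log 0 = 0,

```latex
D_{\mathrm{KL}}(P \,\|\, Q) = \sum_{x} P(x)\,\log\frac{P(x)}{Q(x)} \quad \text{(discrete)},
\qquad
D_{\mathrm{KL}}(P \,\|\, Q) = \int p(x)\,\log\frac{p(x)}{q(x)}\,dx \quad \text{(continuous)}.
```

Choosing the generator f(t) = t log t in the Csiszár f-divergence recovers exactly this quantity, which grounds the specialCaseOf statement below.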
Observed surface forms (5)
| Surface form | Occurrences |
|---|---|
| Gibbs' inequality | 1 |
| KL divergence | 1 |
| Kullback | 1 |
| Kullback–Leibler distance | 1 |
| Kullback–Leibler divergence inequalities | 1 |
Statements (51)
| Predicate | Object |
|---|---|
| instanceOf | f-divergence; information-theoretic measure; relative entropy; statistical divergence |
| alsoKnownAs | Kullback–Leibler divergence (surface form: KL divergence); Kullback–Leibler divergence (surface form: Kullback–Leibler distance); relative entropy |
| appearsIn | Kullback and Leibler 1951 paper |
| definedFor | continuous probability distributions; discrete probability distributions |
| domain | pairs of probability distributions |
| equalsZeroIfAndOnlyIf | two distributions are equal almost everywhere |
| field | information theory; machine learning; probability theory; statistical inference; statistics |
| hasProperty | additive for independent distributions; convex in the pair of distributions |
| isMetric | false |
| isNonNegative | true |
| isSymmetric | false |
| minimizedBy | true data-generating distribution in maximum likelihood |
| namedAfter | Richard Leibler; Solomon Kullback |
| quantifies | difference between probability distributions; information loss when approximating one distribution with another |
| relatedTo | Bregman divergence; Jensen–Shannon divergence; Shannon entropy; cross-entropy; mutual information |
| satisfiesTriangleInequality | false |
| specialCaseOf | Csiszár f-divergence |
| takesValuesIn | [0, +∞] |
| usedAs | loss function in classification; regularizer in probabilistic models |
| usedIn | Bayesian inference; density estimation; distributional reinforcement learning; feature selection; hypothesis testing; Riemannian manifolds (surface form: information geometry); information-theoretic clustering; language modeling; machine learning model training; maximum likelihood estimation; natural gradient descent; reinforcement learning; variational autoencoders; variational inference |
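The boolean statements above (isNonNegative, isSymmetric = false, equalsZeroIfAndOnlyIf) and the relatedTo link to cross-entropy can be checked numerically. A minimal sketch of the discrete definition, assuming plain NumPy; the helper name kl_divergence and the example distributions are illustrative, not from any particular library:

```python
import numpy as np

def kl_divergence(p, q):
    """D_KL(P || Q) for discrete distributions on a shared support, in nats.

    Terms with p[i] == 0 contribute 0 (the 0 * log 0 convention), while
    p[i] > 0 with q[i] == 0 yields +inf, matching takesValuesIn [0, +inf].
    """
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = np.where(p > 0, p * np.log(p / q), 0.0)
    return float(np.sum(terms))

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])

print(kl_divergence(p, q))  # ~0.0253 > 0: isNonNegative (Gibbs' inequality)
print(kl_divergence(q, p))  # ~0.0258, not the same: isSymmetric is false
print(kl_divergence(p, p))  # 0.0: equals zero iff the distributions are equal

# relatedTo cross-entropy: D_KL(p || q) = H(p, q) - H(p), which is why a
# cross-entropy classification loss minimizes the KL divergence to the
# true label distribution (the usedAs statement above).
cross_entropy = -np.sum(p * np.log(q))
entropy = -np.sum(p * np.log(p))
assert np.isclose(kl_divergence(p, q), cross_entropy - entropy)
```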
Referenced by (12)
Surface form annotated when it differs from this entity's canonical label.
| Referencing subject | Surface form used for this entity |
|---|---|
|  | KL divergence |
|  | Kullback–Leibler distance |
|  | Kullback |
| Jensen's inequality | Gibbs' inequality |
| Jensen's inequality | Kullback–Leibler divergence inequalities |