Kullback–Leibler divergence

E6392

Kullback–Leibler divergence is a fundamental information-theoretic measure that quantifies how one probability distribution differs from a reference distribution.
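
As an illustration (not part of this entity record), a minimal sketch of the discrete-case computation, D_KL(P‖Q) = Σ p(x) log(p(x)/q(x)), with the usual conventions for zero probabilities:

```python
import math

def kl_divergence(p, q):
    """D_KL(P || Q) for two discrete distributions given as probability lists.

    Conventions: a term with p(x) == 0 contributes 0; if q(x) == 0 while
    p(x) > 0, the divergence is +infinity.
    """
    total = 0.0
    for px, qx in zip(p, q):
        if px == 0:
            continue
        if qx == 0:
            return math.inf
        total += px * math.log(px / qx)
    return total

p = [0.5, 0.5]  # fair coin
q = [0.9, 0.1]  # biased coin
print(kl_divergence(p, q))  # ~0.5108 nats
print(kl_divergence(q, p))  # ~0.3681 nats -- note the asymmetry
```

Using the natural logarithm gives the result in nats; log base 2 would give bits.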

Statements (51)

instanceOf: f-divergence; information-theoretic measure; relative entropy; statistical divergence
alsoKnownAs: Kullback–Leibler divergence (surface form: KL divergence); Kullback–Leibler divergence (surface form: Kullback–Leibler distance); relative entropy
appearsIn: Kullback and Leibler 1951 paper
definedFor: continuous probability distributions; discrete probability distributions
domain: pairs of probability distributions
equalsZeroIfAndOnlyIf: two distributions are equal almost everywhere
field: information theory; machine learning; probability theory; statistical inference; statistics
hasProperty: additive for independent distributions; convex in the pair of distributions
isMetric: false
isNonNegative: true
isSymmetric: false
minimizedBy: true data-generating distribution in maximum likelihood
namedAfter: Richard Leibler; Solomon Kullback
quantifies: difference between probability distributions; information loss when approximating one distribution with another
relatedTo: Bregman divergence; Jensen–Shannon divergence; Shannon entropy; cross-entropy; mutual information
satisfiesTriangleInequality: false
specialCaseOf: Csiszár f-divergence
takesValuesIn: [0, +∞]
usedAs: loss function in classification; regularizer in probabilistic models
usedIn: Bayesian inference; density estimation; distributional reinforcement learning; feature selection; hypothesis testing; Riemannian manifolds (surface form: information geometry); information-theoretic clustering; language modeling; machine learning model training; maximum likelihood estimation; natural gradient descent; reinforcement learning; variational autoencoders; variational inference
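
Several of the properties recorded above (isNonNegative, isSymmetric: false, equalsZeroIfAndOnlyIf, and the relation to cross-entropy) can be checked numerically. A small sketch, with the distributions chosen purely for illustration:

```python
import math

def entropy(p):
    # Shannon entropy H(P) in nats
    return -sum(px * math.log(px) for px in p if px > 0)

def cross_entropy(p, q):
    # Cross-entropy H(P, Q) = -sum_x p(x) log q(x)
    return -sum(px * math.log(qx) for px, qx in zip(p, q) if px > 0)

def kl(p, q):
    # D_KL(P || Q) = H(P, Q) - H(P)
    return cross_entropy(p, q) - entropy(p)

p = [0.2, 0.3, 0.5]
q = [0.4, 0.4, 0.2]

print(kl(p, q) >= 0)            # non-negative: True
print(kl(p, q) == kl(q, p))     # symmetric: False (in general)
print(abs(kl(p, p)) < 1e-12)    # zero when the distributions coincide: True
```

The decomposition D_KL(P‖Q) = H(P, Q) − H(P) is why minimizing cross-entropy loss against a fixed data distribution is equivalent to minimizing the KL divergence.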

Referenced by (12)

Full triples — surface form annotated when it differs from this entity's canonical label.

Kullback–Leibler divergence alsoKnownAs Kullback–Leibler divergence (this entity surface form: KL divergence)
Kullback–Leibler divergence alsoKnownAs Kullback–Leibler divergence (this entity surface form: Kullback–Leibler distance)
Solomon Kullback coDeveloperOf Kullback–Leibler divergence
Solomon Kullback familyName Kullback–Leibler divergence (this entity surface form: Kullback)
Rényi divergence generalizes Kullback–Leibler divergence
Solomon Kullback hasConceptNamedAfter Kullback–Leibler divergence
Solomon Kullback knownFor Kullback–Leibler divergence
Solomon Kullback notableConcept Kullback–Leibler divergence
Richard Leibler notableWork Kullback–Leibler divergence
Shannon entropy relatedConcept Kullback–Leibler divergence
Jensen inequality relatedTo Kullback–Leibler divergence (subject surface form: Jensen's inequality; this entity surface form: Gibbs' inequality)
Jensen inequality usedFor Kullback–Leibler divergence (subject surface form: Jensen's inequality; this entity surface form: Kullback–Leibler divergence inequalities)
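
The last two triples record that Jensen's inequality underlies Gibbs' inequality, i.e. the non-negativity of the KL divergence. The standard one-line derivation, sketched here for the discrete case (concavity of the logarithm):

```latex
-D_{\mathrm{KL}}(P \,\|\, Q)
  = \sum_{x} p(x) \log \frac{q(x)}{p(x)}
  \;\le\; \log \sum_{x} p(x) \, \frac{q(x)}{p(x)}
  = \log \sum_{x} q(x) = \log 1 = 0,
```

hence D_KL(P‖Q) ≥ 0, with equality if and only if P = Q (almost everywhere), matching the equalsZeroIfAndOnlyIf statement above.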