Rényi divergence

E41069

Rényi divergence is a family of information-theoretic measures, parameterized by an order α, that generalizes the Kullback–Leibler divergence and quantifies the dissimilarity between two probability distributions.
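For reference, a standard form of the definition (not itself one of the stored statements below); the discrete case is shown, and the continuous case replaces the sum with an integral with respect to a dominating measure:

    D_\alpha(P \,\|\, Q) = \frac{1}{\alpha - 1} \log \sum_{i} p_i^{\alpha}\, q_i^{1-\alpha}, \qquad \alpha \in (0, 1) \cup (1, \infty),

with the value at α = 1 defined by continuity as the Kullback–Leibler divergence, D_1(P \,\|\, Q) = \sum_i p_i \log (p_i / q_i).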


Statements (48)
Predicate Object
instanceOf divergence measure
monotone function of an f-divergence
information-theoretic measure
statistical distance
alsoKnownAs α-divergence
appliedIn adversarial learning
differential privacy
generative modeling
variational inference
asymmetric true
category information divergence
dataProcessingInequality true for all α ∈ [0, ∞]
definedFor continuous probability distributions
discrete probability distributions
dependsOn Radon–Nikodym derivative dP/dQ when P is absolutely continuous w.r.t. Q
domain pairs of probability distributions
equalsAt Kullback–Leibler divergence when α = 1
equalsZeroWhen P = Q almost surely
field information theory
machine learning
statistics
generalizes Kullback–Leibler divergence
hasContinuousLimitAt α → 1 to KL divergence
introducedBy Alfréd Rényi
introducedIn 1961
isMetric false
monotoneIn order α (non-decreasing)
namedAfter Alfréd Rényi
nonNegative true
parameterizedBy order α
relatedTo Bhattacharyya distance
Chernoff information
Hellinger distance
Tsallis divergence
relatedToAt max-divergence when α → ∞
min-divergence when α → 0
requires P absolutely continuous with respect to Q for finite value when α ≥ 1
satisfies D_α(P‖Q) ≥ 0
specialCaseAt α = 0
α = 1
α = ∞ (see the numerical sketch following this table)
usedIn distributional robustness
hypothesis testing
information geometry
machine learning regularization
privacy analysis
robust statistics
usedToDefine Rényi entropy
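As an illustration of the special-case statements above (α → 1 recovering the Kullback–Leibler divergence, α → ∞ giving the max-divergence, and monotonicity in α), a minimal numerical sketch; the function renyi_divergence and the example distributions are illustrative assumptions, not part of the stored data:

import numpy as np

def renyi_divergence(p, q, alpha):
    # Illustrative sketch: Rényi divergence D_alpha(p || q) for discrete
    # distributions, using natural logarithms.
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    if np.isclose(alpha, 1.0):
        # alpha -> 1 limit: Kullback–Leibler divergence
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))
    if np.isinf(alpha):
        # alpha -> infinity limit: max-divergence (worst-case log-likelihood ratio)
        return float(np.max(np.log(p[mask] / q[mask])))
    # General case; infinite when P is not absolutely continuous w.r.t. Q and alpha > 1
    s = np.sum(p[mask] ** alpha * q[mask] ** (1.0 - alpha))
    return float(np.log(s) / (alpha - 1.0))

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

for a in (0.5, 0.999, 1.0, 2.0, np.inf):
    print(f"alpha={a}: {renyi_divergence(p, q, a):.6f}")
# The printed values are non-decreasing in alpha, and the value at
# alpha = 0.999 is close to the KL divergence obtained at alpha = 1.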

Referenced by (2)
Subject (surface form when different) Predicate
Alfréd Rényi knownFor
Rényi entropy relatedTo
