Jensen–Shannon divergence
E837388
distance-like measure between probability distributions
information-theoretic measure
statistical divergence
Jensen–Shannon divergence is a symmetrized and smoothed variant of the Kullback–Leibler divergence, used as a measure of dissimilarity between probability distributions and widely applied in information theory and machine learning.
Statements (51)
| Predicate | Object |
|---|---|
| instanceOf | distance-like measure between probability distributions; information-theoretic measure; statistical divergence |
| alsoKnownAs | Jensen–Shannon distance (square root form) |
| basedOn | Kullback–Leibler divergence |
| bounded | true |
| canBeExpressedUsingEntropy | true |
| definedFor | pairs of probability distributions |
| definedOn | continuous probability distributions (via densities); discrete probability distributions |
| entropyForm | JSD(P‖Q) = H(M) − 1/2 H(P) − 1/2 H(Q) |
| field | information theory; machine learning; probability theory; statistics |
| formula | JSD(P‖Q) = 1/2 KL(P‖M) + 1/2 KL(Q‖M) (see the numeric sketch after this table) |
| generalizedDefinition | JSD({P_i}, {w_i}) = H(∑ w_i P_i) − ∑ w_i H(P_i) (see the weighted sketch after this table) |
| generalizesTo | more than two distributions |
| isConvexInEachArgument | true |
| isDefinedWhenSupportsDiffer | true |
| isFdivergence | true |
| isFinite | true |
| isJointlyConvex | true |
| isMetricWhenSquareRootTaken | true (see the numeric spot-check after this table) |
| isRelatedTo | Shannon entropy |
| isRobustToSupportMismatchComparedTo | Kullback–Leibler divergence |
| isSmoothedVersionOf | Kullback–Leibler divergence |
| isSquareOfMetric | true |
| isSymmetric | true |
| isSymmetrizationOf | Kullback–Leibler divergence |
| isWidelyUsedAs | measure of dissimilarity between probability distributions |
| isZeroIffDistributionsEqual | true |
| logarithmBase | commonly base 2 |
| metricName | Jensen–Shannon distance |
| mixtureDistributionDefinition | M = 1/2 (P + Q) for two distributions P and Q |
| nonNegative | true |
| requires | probability distributions with total mass 1 |
| satisfiesTriangleInequalityWhenSquareRootTaken | true |
| unit | bits (for base-2 logarithm); nats (for natural logarithm) |
| upperBoundValue | ln 2 nats = 1 bit, for two distributions; attained when the supports are disjoint |
| usedIn | GAN training objectives (via JS-based losses); bioinformatics sequence comparison; clustering of probability distributions; distributional clustering of words; document similarity; generative model evaluation; natural language processing; topic modeling evaluation |
| usesMixtureDistribution | true |
| weightConstraints | weights w_i are nonnegative and sum to 1 |
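
The formula, mixtureDistributionDefinition, and entropyForm statements above fully determine the two-distribution case. Below is a minimal NumPy sketch (function names are illustrative, not from any particular library) that computes JSD both ways and checks the bounded and isDefinedWhenSupportsDiffer claims on a pair of distributions with partially disjoint support:

```python
import numpy as np

def entropy(p, base=2.0):
    """Shannon entropy H(p); 0 * log(0) is treated as 0."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log(p[nz])) / np.log(base)

def kl(p, q, base=2.0):
    """Kullback-Leibler divergence KL(p || q); assumes q > 0 wherever p > 0."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    nz = p > 0
    return np.sum(p[nz] * np.log(p[nz] / q[nz])) / np.log(base)

def jsd(p, q, base=2.0):
    """JSD(P || Q) = 1/2 KL(P || M) + 1/2 KL(Q || M), with M = (P + Q) / 2."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)  # the mixture distribution; m > 0 wherever p or q is
    return 0.5 * kl(p, m, base) + 0.5 * kl(q, m, base)

# Partially disjoint supports: KL(P || Q) would be infinite, JSD stays finite.
P = np.array([0.5, 0.5, 0.0])
Q = np.array([0.0, 0.5, 0.5])

d = jsd(P, Q)  # in bits, since base=2
M = 0.5 * (P + Q)
via_entropy = entropy(M) - 0.5 * entropy(P) - 0.5 * entropy(Q)

print(d)                           # 0.5
assert np.isclose(d, via_entropy)  # entropyForm agrees with the KL form
assert 0.0 <= d <= 1.0             # bounded by 1 bit for two distributions
```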
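The generalizedDefinition and weightConstraints statements extend this to weighted families of distributions. A self-contained sketch under the same assumptions (discrete distributions as NumPy arrays; `jsd_general` is an illustrative name):

```python
import numpy as np

def entropy(p, base=2.0):
    """Shannon entropy H(p); 0 * log(0) is treated as 0."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log(p[nz])) / np.log(base)

def jsd_general(dists, weights, base=2.0):
    """Generalized JSD: H(sum_i w_i P_i) - sum_i w_i H(P_i)."""
    w = np.asarray(weights, dtype=float)
    # weightConstraints: nonnegative weights summing to 1
    assert np.all(w >= 0) and np.isclose(w.sum(), 1.0)
    dists = [np.asarray(p, dtype=float) for p in dists]
    mixture = sum(wi * pi for wi, pi in zip(w, dists))
    return entropy(mixture, base) - sum(wi * entropy(pi, base)
                                        for wi, pi in zip(w, dists))

# Three point masses with uniform weights: the mixture is uniform over
# three outcomes and each component has zero entropy, so JSD = log2(3).
dists = [np.array([1.0, 0.0, 0.0]),
         np.array([0.0, 1.0, 0.0]),
         np.array([0.0, 0.0, 1.0])]
print(jsd_general(dists, [1/3, 1/3, 1/3]))  # ≈ 1.585 bits
```

With two distributions and weights [1/2, 1/2], this reduces to the two-distribution form above.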
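The isMetricWhenSquareRootTaken, metricName, and satisfiesTriangleInequalityWhenSquareRootTaken statements assert that the square root of the divergence is a true metric. A quick numeric spot-check (illustrative only, not a proof):

```python
import numpy as np

rng = np.random.default_rng(0)

def jsd(p, q):
    """Two-distribution JSD in bits (base-2 logs)."""
    m = 0.5 * (p + q)
    def kl(a, b):
        nz = a > 0
        return np.sum(a[nz] * np.log2(a[nz] / b[nz]))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def js_distance(p, q):
    """Jensen–Shannon distance: the square root of the divergence."""
    return np.sqrt(jsd(p, q))

# Spot-check the triangle inequality on random distributions. Violations
# would show up quickly if the square root were not taken.
for _ in range(1000):
    p, q, r = rng.dirichlet(np.ones(5), size=3)
    assert js_distance(p, r) <= js_distance(p, q) + js_distance(q, r) + 1e-12
print("triangle inequality held on 1000 random triples")
```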
Referenced by (2)
Full triples — surface form annotated when it differs from this entity's canonical label.