Disambiguation evidence for Jensen's inequality via surface form
"Jensen's inequality"
As subject (56)
Triples where this entity appears as the subject under the label "Jensen's inequality".
| Predicate | Object |
|---|---|
| appliesTo | concave functions |
| appliesTo | convex functions |
| coreStatement | For a concave function φ and random variable X, φ(E[X]) ≥ E[φ(X)] |
| coreStatement | For a convex function φ and random variable X, φ(E[X]) ≤ E[φ(X)] |
| equalityCondition | convex function is affine on the support of the random variable |
| equalityCondition | random variable is almost surely constant |
| field | convex analysis |
| field | measure theory |
| field | probability theory |
| field | real analysis |
| generalizationOf | Cauchy–Schwarz inequality in some formulations |
| generalizationOf | inequality between arithmetic and geometric means |
| generalizationOf | inequality between arithmetic and harmonic means |
| hasVariant | conditional Jensen's inequality |
| hasVariant | matrix Jensen inequality |
| hasVariant | operator Jensen inequality |
| holdsFor | continuous distributions |
| holdsFor | discrete distributions |
| holdsFor | finite sums |
| holdsFor | integrals |
| holdsFor | probability measures |
| implies | E[\|X\|^p] ≥ \|E[X]\|^p for p ≥ 1 |
| implies | log E[X] ≥ E[log X] for positive X, since log is concave |
| instanceOf | mathematical inequality |
| instanceOf | result in convex analysis |
| instanceOf | result in probability theory |
| namedAfter | Johan Jensen |
| relatedTo | Kullback–Leibler divergence (surface form: Gibbs' inequality) |
| relatedTo | Karamata's inequality |
| relatedTo | Young's inequality |
| relatedTo | convex combination |
| relatedTo | epigraph of a convex function |
| relatedTo | majorization theory |
| relatedTo | supporting hyperplane |
| relates | expectation of a function |
| relates | expectation of a random variable |
| relates | function of an expectation |
| requires | convexity of the function on the range of the random variable |
| requires | integrable random variable |
| timePeriod | early 20th century |
| usedFor | Hölder-type inequalities |
| usedFor | Jensen–Shannon divergence properties |
| usedFor | Kullback–Leibler divergence (surface form: Kullback–Leibler divergence inequalities) |
| usedFor | Minkowski inequality proofs |
| usedFor | bounding expectations |
| usedFor | bounding moments of random variables |
| usedFor | convex optimization analysis |
| usedFor | deriving other inequalities |
| usedFor | entropy bounds |
| usedFor | evidence lower bound (ELBO) derivation |
| usedFor | information theory inequalities |
| usedFor | machine learning generalization bounds |
| usedFor | proving convergence results |
| usedFor | proving law of large numbers variants |
| usedFor | risk measures in finance |
| usedFor | variational inference |
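The coreStatement rows above can be checked numerically. The sketch below is illustrative only (the choice of distribution, sample size, and test functions are assumptions, not part of the triples): it draws positive samples and verifies φ(E[X]) ≤ E[φ(X)] for the convex φ(x) = x², and the reversed inequality E[log X] ≤ log E[X] for the concave logarithm.

```python
import math
import random

# Illustrative check of Jensen's inequality on a Monte Carlo sample.
# Distribution and functions are arbitrary choices for demonstration.
random.seed(0)
xs = [random.uniform(0.5, 4.0) for _ in range(100_000)]  # positive samples

mean = sum(xs) / len(xs)  # E[X]

# Convex case, φ(x) = x^2: expect φ(E[X]) <= E[φ(X)].
lhs_convex = mean ** 2
rhs_convex = sum(x * x for x in xs) / len(xs)

# Concave case, φ(x) = log x: expect E[φ(X)] <= φ(E[X]).
lhs_concave = sum(math.log(x) for x in xs) / len(xs)
rhs_concave = math.log(mean)

print(lhs_convex <= rhs_convex)   # convex direction holds
print(lhs_concave <= rhs_concave) # concave direction holds
```

The gap in each inequality shrinks to zero exactly in the equalityCondition rows above: when X is almost surely constant or φ is affine on the support of X.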