Reweighted information inequalities
2026-03-13 • Information Theory
AI summary
The authors give a new way to understand certain mathematical inequalities that control how 'spread out' or 'mixed' probability distributions are. They focus on mixtures of simpler distributions that each satisfy these inequalities. Their result says that if a distribution is close to such a mixture in a specific technical sense (relative Fisher information), then it is also close, in relative entropy or transport distance, to a reweighted version of the mixture. This helps explain behaviors observed when sampling from complex, multimodal distributions with algorithms such as Langevin Monte Carlo.
log-Sobolev inequalities, transport-information inequalities, mixture distributions, relative Fisher information, relative entropy, transport distance, Langevin Monte Carlo, multimodal distributions, probability measures, sampling algorithms
Authors
Jonathan Niles-Weed
Abstract
We establish a variant of the log-Sobolev and transport-information inequalities for mixture distributions. If a probability measure $\pi$ can be decomposed into components that individually satisfy such inequalities, then any measure $\mu$ close to $\pi$ in relative Fisher information is close in relative entropy or transport distance to a reweighted version of $\pi$ with the same mixture components but possibly different weights. This provides a user-friendly interpretation of Fisher information bounds for non-log-concave measures and explains phenomena observed in the analysis of Langevin Monte Carlo for multimodal distributions.
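To make the shape of the result concrete, here is a minimal LaTeX sketch of the quantities involved and of the schematic form the abstract describes. The constants $C$, $C'$, the reweighting weights $q_i$, and the precise hypotheses on the components are placeholders for illustration, not taken from the paper.

```latex
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

% Relative entropy and relative Fisher information of mu with respect to pi.
\[
  H(\mu \,\|\, \pi) = \int \log\frac{d\mu}{d\pi}\, d\mu,
  \qquad
  I(\mu \,\|\, \pi) = \int \Bigl|\nabla \log\frac{d\mu}{d\pi}\Bigr|^{2}\, d\mu .
\]

% Classical log-Sobolev inequality for pi with constant rho > 0:
% small relative Fisher information forces small relative entropy.
\[
  H(\mu \,\|\, \pi) \;\le\; \frac{1}{2\rho}\, I(\mu \,\|\, \pi)
  \quad \text{for all probability measures } \mu .
\]

% Schematic mixture variant: pi = sum_i p_i pi_i with each component
% satisfying such an inequality. The weights q_i and the constants
% C, C' below are placeholders for the form described in the abstract.
\[
  H\Bigl(\mu \,\Big\|\, \textstyle\sum_i q_i \pi_i\Bigr)
  \;\le\; C\, I(\mu \,\|\, \pi),
  \qquad
  W_2^{2}\Bigl(\mu,\, \textstyle\sum_i q_i \pi_i\Bigr)
  \;\le\; C'\, I(\mu \,\|\, \pi).
\]

\end{document}
```

The point of the schematic form is that the Fisher information is computed against the original mixture $\pi$, while the entropy or transport bound holds against a reweighted mixture $\sum_i q_i \pi_i$, which may redistribute mass among the same components.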