Entropic uncertainty

He showed that for any such functions the sum of the Shannon entropies is non-negative.
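Writing f for the function and g for its Fourier transform, with |f|² and |g|² the associated probability densities (the symbols here are assumed to match the preceding definitions), the statement reads
\[
H\big(|f|^2\big) + H\big(|g|^2\big) \;\equiv\; -\int_{-\infty}^{\infty} |f(x)|^2 \log |f(x)|^2 \,dx \;-\; \int_{-\infty}^{\infty} |g(y)|^2 \log |g(y)|^2 \,dy \;\ge\; 0 .
\]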

Rearranging terms finally yields an inequality in terms of the sum of the Rényi entropies.
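With conjugate Rényi orders α and β satisfying 1/α + 1/β = 2 and 1/2 < α < 1 < β (notation assumed to follow the derivation above), the resulting bound is
\[
H_\alpha\big(|f|^2\big) + H_\beta\big(|g|^2\big) \;\ge\; \frac{1}{2}\left(\frac{\log\alpha}{\alpha-1} + \frac{\log\beta}{\beta-1}\right) - \log 2 .
\]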

Taking the limit of this last inequality as α, β → 1 yields the less general Shannon entropy inequality.
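Since log α/(α−1) and log β/(β−1) both tend to 1 in this limit, the right-hand side tends to 1 − log 2, giving
\[
H\big(|f|^2\big) + H\big(|g|^2\big) \;\ge\; \log\frac{e}{2} ,
\]
with equality in the case of Gaussian densities.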

The constant will be different, though, for a different normalization of the Fourier transform (such as the convention usually used in physics, where ħ = 1).
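With a unitary convention such as
\[
g(y) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{-ixy} f(x)\,dx ,
\qquad
f(x) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} e^{ixy} g(y)\,dy
\]
(written here as an illustrative choice consistent with ħ = 1), the squared transform |g|² is dilated by a factor of 2π, which adds log(2π) to its entropy, so the bound becomes
\[
H\big(|f|^2\big) + H\big(|g|^2\big) \;\ge\; \log(e\pi) .
\]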

A low (or large negative) Shannon entropy means that a considerable mass of the probability distribution is confined to a set of small measure.

Note that this set of small measure need not be contiguous; a probability distribution can have several concentrations of mass in intervals of small measure, and the entropy may still be low no matter how widely scattered those intervals are. This is not the case with the variance: variance measures the concentration of mass about the mean of the distribution, and a low variance means that a considerable mass of the probability distribution is concentrated in a contiguous interval of small measure.
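A concrete contrast (the density p and the parameters ε, R below are an illustrative construction): put mass 1/2 uniformly on each of two intervals of width ε whose left endpoints are 0 and R, with R > ε so the intervals are disjoint. Then
\[
p(x) = \frac{1}{2\varepsilon}\,\mathbf{1}_{[0,\varepsilon]\cup[R,\,R+\varepsilon]}(x) ,
\qquad
H(p) = \log(2\varepsilon) ,
\qquad
\operatorname{Var}(p) = \frac{R^2}{4} + \frac{\varepsilon^2}{12} .
\]
The entropy depends only on the total width 2ε of the support and stays low as ε shrinks, while the variance grows without bound as the separation R increases.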