Entropy(2): Other useful properties of entropy
In the last article, I listed the four basic properties that make entropy a good representation of uncertainty. As a continuation of that article, I want to list some other important properties of entropy here.
Properties 5-7 below are summarized from [1].
Property 5: Uniform distribution with more classes has higher entropy
$H\left(\frac{1}{m}, \dots, \frac{1}{m}\right) < H\left(\frac{1}{n}, \dots, \frac{1}{n}\right)$ if $m < n$, where $m$ and $n$ are the numbers of classes. Intuitively, the uniform distribution over $n$ classes has entropy $\log_2 n$, which grows with $n$.
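This is easy to check numerically. A minimal sketch in Python (the `entropy` helper below is my own, not from the article):

```python
import math

def entropy(probs):
    """Shannon entropy in bits; terms with p = 0 contribute 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A uniform distribution over n classes has entropy log2(n),
# so adding more classes strictly increases the entropy.
h4 = entropy([1/4] * 4)  # log2(4) = 2 bits
h8 = entropy([1/8] * 8)  # log2(8) = 3 bits
assert h4 < h8
```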
Property 6: Entropy is non-negative
$H(p_1, \dots, p_n) \ge 0$, with equality achieved when one outcome is certain, e.g. $p = (1, 0, \dots, 0)$.
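A quick numerical check of both the zero case and the general case (again with my own `entropy` helper):

```python
import math
import random

def entropy(probs):
    """Shannon entropy in bits; terms with p = 0 contribute 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Entropy is exactly zero when one outcome is certain...
assert entropy([1.0, 0.0, 0.0]) == 0.0

# ...and non-negative for any distribution, since each term
# -p * log2(p) is >= 0 for 0 <= p <= 1.
random.seed(0)
weights = [random.random() for _ in range(5)]
probs = [w / sum(weights) for w in weights]
assert entropy(probs) >= 0.0
```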
Property 7: Changing the order of the arguments does not change the entropy
In other words, entropy is symmetric in its arguments: $H(p_1, p_2, \dots, p_n) = H(p_2, p_1, \dots, p_n)$, and likewise for any other permutation.
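The symmetry can be verified over every permutation of a small distribution (the `entropy` helper is mine, not from the article):

```python
import math
from itertools import permutations

def entropy(probs):
    """Shannon entropy in bits; terms with p = 0 contribute 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# All 3! = 6 orderings of the same probabilities give one entropy value.
p = (0.5, 0.3, 0.2)
values = {round(entropy(perm), 12) for perm in permutations(p)}
assert len(values) == 1
```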
The following properties are collected from other sources. They relate to specific distributions or to specific real-world problems.
Property 8: The Gaussian has maximum entropy among continuous distributions with finite variance
The Gaussian distribution has the maximum entropy among all continuous distributions that are supported on the entire real line and have a finite mean and variance. In other words, among all distributions with a fixed variance $\sigma^2$, the Gaussian has the largest (differential) entropy, $\frac{1}{2}\log_2(2\pi e \sigma^2)$ bits.
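As a spot check, we can compare the Gaussian against one competitor at equal variance. A sketch using the standard closed-form differential entropies (in bits) of the Gaussian and the uniform distribution:

```python
import math

# Differential entropies (in bits) at the same variance sigma^2.
# Standard closed forms:
#   Gaussian:         0.5 * log2(2 * pi * e * sigma^2)
#   Uniform on [a,b]: log2(b - a), where variance = (b - a)^2 / 12
sigma2 = 1.0
h_gauss = 0.5 * math.log2(2 * math.pi * math.e * sigma2)
h_unif = math.log2(math.sqrt(12 * sigma2))  # width matched to sigma^2
assert h_unif < h_gauss  # the Gaussian wins at equal variance
```

This is only one competitor, of course; the full statement holds against every distribution with that variance.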
Property 9: Shannon entropy is the average (expected) encoding length of a random event
For a message or event with probability $p$, the most efficient (i.e. most compact) encoding of that message requires $\log_2 \frac{1}{p}$ bits. Then for a random variable with outcomes having probabilities $p_i$, the expected encoding length is $\sum_i p_i \log_2 \frac{1}{p_i} = H$.
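For a dyadic distribution (all probabilities powers of 1/2), the ideal lengths $\log_2 \frac{1}{p_i}$ are whole numbers, so an actual prefix code achieves them exactly. A small sketch (the example codewords are my own illustration):

```python
import math

# For the dyadic distribution (1/2, 1/4, 1/8, 1/8), an optimal prefix
# code assigns each outcome exactly log2(1/p) bits (e.g. codewords
# 0, 10, 110, 111), so the expected codeword length equals H.
probs = [1/2, 1/4, 1/8, 1/8]
lengths = [math.log2(1 / p) for p in probs]  # 1, 2, 3, 3 bits
expected_len = sum(p * l for p, l in zip(probs, lengths))
H = -sum(p * math.log2(p) for p in probs)
assert abs(expected_len - H) < 1e-12  # both equal 1.75 bits
```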