Entropy of a probability distribution

The entropy of a probability distribution is a number that describes the degree of uncertainty or disorder the distribution represents.

Examples

Assume we have a set of two mutually exclusive propositions (or equivalently, a random experiment with two possible outcomes). Assume both possibilities are equally likely.

Our uncertainty about the eventual outcome is then rather small: we know in advance that it will be one of exactly two known alternatives.

Now assume we have a set of a million alternatives, all of them equally likely, rather than two.

It seems clear that our uncertainty about the eventual outcome is now much greater.
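
To put numbers on this intuition (a brief illustration using the base-two definition from the next section): a distribution over n equally likely alternatives has entropy <math>\log_2 n</math>, so

<math>H = -\sum_{i=1}^{2} \tfrac{1}{2} \log_2 \tfrac{1}{2} = 1 \text{ bit}, \qquad H = -\sum_{i=1}^{10^6} 10^{-6} \log_2 10^{-6} = \log_2 10^6 \approx 19.9 \text{ bits}.</math>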

Formal definitions

  1. Given a discrete probability distribution function f, the entropy H of the distribution is given by <math>H = -\sum_{i \,:\, f(x_i) \ne 0} f(x_i) \log_2 f(x_i)</math>
  2. Given a continuous probability distribution function f, the entropy H of the distribution is given by <math>H = -\int_{x \,:\, f(x) \ne 0} f(x) \log_2 f(x) \, dx</math>

Note that some authors prefer to use the natural logarithm rather than base two.
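
As an illustration (not part of the original article), here is a minimal Python sketch of the discrete definition; the function name entropy and the example probability lists are chosen only for this sketch:

import math

def entropy(probs, base=2):
    # Entropy of a discrete distribution given as a list of probabilities summing to 1.
    # Terms with probability zero are skipped, matching the restriction f(x_i) != 0 above.
    return -sum(p * math.log(p, base) for p in probs if p > 0)

print(entropy([0.5, 0.5]))       # two equally likely outcomes: 1.0 bit
print(entropy([1e-6] * 10**6))   # a million equally likely outcomes: about 19.93 bits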

See also

References

External links