Entropy of a probability distribution
The entropy of a probability distribution is a number that describes the degree of uncertainty or disorder the distribution represents.
Examples
Assume we have a set of two mutually exclusive propositions (or, equivalently, a random experiment with two possible outcomes), and that both possibilities are equally likely.
Our uncertainty in advance about the eventual outcome is then rather small: we know it will be one of exactly two known alternatives.
Now assume instead that we have a set of a million alternatives, all of them equally likely, rather than two.
It seems clear that our uncertainty about the eventual outcome is now much greater.
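These two situations can be quantified using the entropy formula given under Formal definitions below. Taking the logarithm to base 2 (so that entropy is measured in bits), a uniform distribution over N alternatives has entropy log2 N:

H = -\sum_{i=1}^{2} \tfrac{1}{2}\log_2 \tfrac{1}{2} = \log_2 2 = 1 \text{ bit}

H = -\sum_{i=1}^{10^6} 10^{-6}\log_2 10^{-6} = \log_2 10^6 \approx 19.93 \text{ bits}

The larger number reflects the greater uncertainty: roughly twenty yes/no questions are needed to single out one alternative among a million, compared with a single question for two alternatives.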
Formal definitions
- Given a discrete probability distribution function f, the entropy H of the distribution is given by
  H = -\sum_{i} f(x_i) \log f(x_i)
- Given a continuous probability distribution function f, the entropy H of the distribution is given by
  H = -\int f(x) \log f(x) \, dx
The base of the logarithm determines the unit of measurement; base 2 gives the entropy in bits.
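As an illustration of the discrete case, the following is a minimal sketch in Python (not part of the original article) that computes the entropy of a distribution given as a list of probabilities, using base-2 logarithms so the result is in bits:

import math

def entropy(probabilities):
    """Entropy in bits of a discrete distribution given as probabilities summing to 1."""
    # Terms with probability 0 contribute nothing (p * log p tends to 0 as p tends to 0).
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Two equally likely outcomes: 1 bit of uncertainty.
print(entropy([0.5, 0.5]))        # 1.0

# A million equally likely alternatives: about 19.93 bits.
n = 10**6
print(entropy([1.0 / n] * n))     # ~19.93

# A heavily skewed two-outcome distribution has much lower entropy.
print(entropy([0.99, 0.01]))      # ~0.08

For a fixed number of alternatives, the uniform distribution has the largest entropy, matching the intuition that it represents the greatest uncertainty.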