Probability distribution

From Citizendium
Revision as of 22:41, 24 June 2007

A random variable has a probability distribution, which describes how likely each of its possible values is, or equivalently, the relative frequencies expected when an experiment is repeated many times. As a simple example, consider a coin toss experiment. While we cannot predict the result of any individual toss, over many tosses we expect the results to average out to heads half the time and tails half the time (assuming a fair coin).

There are two main classes of probability distributions: discrete and continuous. Discrete distributions describe variables that take on discrete values only (typically the positive integers), while continuous distributions describe variables that can take on arbitrary values in a continuum (typically the real numbers).

In more advanced studies, one also comes across hybrid distributions.


The following are among the most important discrete probability distributions:

Bernoulli - Each experiment yields a 1 ("success") with probability p or a 0 ("failure") with probability 1-p. For example, when tossing a fair coin you can assign the value 1 to heads and 0 to tails. Over many tosses you would expect heads and tails to occur about equally often, so p = 50% for heads and 1-p = 50% for tails.

An experiment where the outcome follows the Bernoulli distribution is called a Bernoulli trial.
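A Bernoulli trial is straightforward to simulate. The following Python sketch uses the standard `random` module; the helper name `bernoulli_trial` is illustrative, not standard. Over many trials the observed success frequency approaches p:

```python
import random

def bernoulli_trial(p):
    """One Bernoulli trial: returns 1 ("success") with probability p, else 0."""
    return 1 if random.random() < p else 0

random.seed(42)  # fixed seed so the run is reproducible
tosses = [bernoulli_trial(0.5) for _ in range(100_000)]
# The fraction of successes approaches p = 0.5 as the number
# of trials grows (law of large numbers).
print(sum(tosses) / len(tosses))
```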

Binomial - Each experiment consists of a fixed number n of identical, independent Bernoulli trials, for instance tossing a fair coin n times, and the outcome is the number of successes.
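A binomial sample can thus be generated directly as a sum of Bernoulli trials, as in this Python sketch (the name `binomial_sample` is illustrative):

```python
import random

def binomial_sample(n, p):
    """Number of successes in n independent Bernoulli(p) trials."""
    return sum(1 for _ in range(n) if random.random() < p)

random.seed(0)
samples = [binomial_sample(10, 0.5) for _ in range(20_000)]
# Each sample lies between 0 and n; the sample mean is
# close to the theoretical mean n * p = 5.
print(sum(samples) / len(samples))
```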

Uniform - Each experiment has a finite number of possible outcomes, each with the same probability. Throwing a fair die, for instance, has six equally likely outcomes. The Bernoulli distribution with a fair coin (p = 1/2) is another example.
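The fair-die example can be checked empirically; this short Python sketch counts how often each face comes up:

```python
import random
from collections import Counter

random.seed(0)
rolls = [random.randint(1, 6) for _ in range(60_000)]
counts = Counter(rolls)
# Under the discrete uniform distribution each of the six faces
# has probability 1/6, so each count should be near 10,000.
for face in range(1, 7):
    print(face, counts[face])
```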

Poisson - Consider an experiment in which we wait for events to occur, and the expected remaining waiting time is independent of how long we have already waited. Then the number of events per unit time follows a Poisson distribution.
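This link between memoryless waiting times and event counts can be sketched in Python: memoryless waits are exponential (drawn here with the standard `random.expovariate`), and the number of them that fit into one unit of time is a Poisson-distributed count. The helper name `poisson_count` is illustrative:

```python
import random

def poisson_count(rate):
    """Number of events in one unit of time when the waits
    between events are memoryless (exponential with the given rate)."""
    t, events = 0.0, 0
    while True:
        t += random.expovariate(rate)  # next memoryless waiting time
        if t > 1.0:
            return events
        events += 1

random.seed(0)
counts = [poisson_count(3.0) for _ in range(20_000)]
# For a Poisson variable the mean equals the rate, here 3.
print(sum(counts) / len(counts))
```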

Geometric

Negative Binomial


The following are several important continuous probability distributions:

Gaussian (or normal)

Uniform continuous

Exponential - If, in a sequence of events, the waiting time until the next event is independent of how long we have already waited, the time between consecutive events follows the exponential distribution.
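The memorylessness just described can be checked numerically. In this Python sketch, the probability of waiting an additional time t, given that we have already waited s, matches the unconditional probability of waiting t:

```python
import math
import random

random.seed(1)
rate = 2.0
waits = [random.expovariate(rate) for _ in range(200_000)]

s, t = 0.5, 0.5
already_waited = [w for w in waits if w > s]
# P(T > s + t | T > s), estimated on the waits that exceeded s:
p_extra = sum(1 for w in already_waited if w > s + t) / len(already_waited)
# P(T > t), estimated on all waits:
p_fresh = sum(1 for w in waits if w > t) / len(waits)

# Both estimate exp(-rate * t) = exp(-1), about 0.368.
print(p_extra, p_fresh)
```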

Gamma

Rayleigh

Cauchy

Laplacian