Stochastic convergence


Stochastic convergence is a mathematical concept intended to formalize the idea that a sequence of essentially random or unpredictable events sometimes tends to settle into a pattern.

Four different varieties of stochastic convergence are noted:

  • Almost sure convergence
  • Convergence in probability
  • Convergence in distribution
  • Convergence in nth order mean


Almost sure convergence

Example

Consider a short-lived animal of some species. We may record the exact amount of food it consumes day by day. This sequence of numbers will be unpredictable in advance, but we may be quite certain that one day the number will become zero, and stay zero forever after.

Formal definition

Let $X_1, X_2, \ldots$ be a sequence of stochastic variables.

If $P\left( \lim_{n \to \infty} X_n = a \right) = 1$ for some $a$, then the sequence has almost sure convergence to $a$.
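
The example above is easy to simulate. Below is a minimal Python sketch; the function name food_path and the geometric-lifetime model with an illustrative daily death probability are assumptions made here for illustration, not part of the original example:

  import random

  def food_path(days, death_prob=0.05):
      # One sample path: daily food intake of a short-lived animal.
      # Assumed model: the animal dies on any given day with
      # probability death_prob; after death the intake is zero
      # forever, so every sample path eventually settles at 0.
      alive = True
      path = []
      for _ in range(days):
          if alive and random.random() < death_prob:
              alive = False
          path.append(random.uniform(1.0, 3.0) if alive else 0.0)
      return path

  # Each simulated path ends in an unbroken run of zeros, illustrating
  # P(lim X_n = 0) = 1, i.e. almost sure convergence to 0.
  for _ in range(3):
      print("last 5 days:", food_path(200)[-5:])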



Convergence in probability

Example

We may keep tossing a die and, after every toss, note the average outcome so far. The exact number obtained after each toss will be unpredictable, but for a fair die it will tend to get closer and closer to the arithmetic average of 1, 2, 3, 4, 5 and 6, i.e. 3.5.
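
This behaviour is easy to simulate. The following Python sketch (the function name running_averages is ours, for illustration) tracks the running average of fair die tosses:

  import random

  def running_averages(n_tosses):
      # Average outcome after each toss of a fair six-sided die.
      total = 0
      averages = []
      for i in range(1, n_tosses + 1):
          total += random.randint(1, 6)
          averages.append(total / i)
      return averages

  avgs = running_averages(100000)
  for i in (10, 100, 1000, 100000):
      # The running average drifts toward 3.5 as i grows.
      print(i, avgs[i - 1])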


Formal definition
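
Let $X_1, X_2, \ldots$ be a sequence of stochastic variables.

If for every $\varepsilon > 0$ we have $\lim_{n \to \infty} P\left( | X_n - a | > \varepsilon \right) = 0$ for some $a$, then the sequence has convergence in probability to $a$.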

Convergence in distribution

Example
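
Consider the sum $S_n$ of $n$ tosses of a fair die, standardized to mean zero and variance one: $Z_n = \frac{S_n - 3.5\,n}{\sqrt{n \cdot 35/12}}$. Each $Z_n$ is a discrete variable, yet by the central limit theorem its distribution gets arbitrarily close to the standard normal distribution as $n$ grows.

A minimal Python sketch of this (the helper names standardized_sum and normal_cdf are ours, for illustration) compares the empirical distribution of $Z_n$ with the normal one:

  import math
  import random

  def standardized_sum(n):
      # (S_n - n*mu) / sqrt(n*sigma^2) for n fair die tosses,
      # with mu = 3.5 and sigma^2 = 35/12.
      s = sum(random.randint(1, 6) for _ in range(n))
      return (s - 3.5 * n) / math.sqrt(n * 35 / 12)

  def normal_cdf(x):
      # Standard normal cumulative distribution function.
      return 0.5 * (1 + math.erf(x / math.sqrt(2)))

  samples = [standardized_sum(50) for _ in range(5000)]
  for x in (-1.0, 0.0, 1.0):
      empirical = sum(s <= x for s in samples) / len(samples)
      # The two columns agree to roughly two decimals.
      print(x, round(empirical, 3), round(normal_cdf(x), 3))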

Formal definition
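
Let $X_1, X_2, \ldots$ be a sequence of stochastic variables, and let $F_n$ denote the cumulative distribution function of $X_n$.

If $\lim_{n \to \infty} F_n(x) = F(x)$ for every point $x$ at which the limiting cumulative distribution function $F$ is continuous, then the sequence has convergence in distribution to a variable with distribution function $F$.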

Convergence in nth order mean

Example
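
Consider again the average of the first $n$ tosses of a fair die. The expected squared distance between this average and 3.5 equals $\frac{35/12}{n}$, which tends to zero, so the average converges to 3.5 in second order mean (mean square).

A small Monte Carlo sketch of this decay in Python (the function name mean_sq_error and the choice of 2000 replications are ours, for illustration):

  import random

  def mean_sq_error(n, reps=2000):
      # Monte Carlo estimate of E[(average of n die rolls - 3.5)^2].
      total = 0.0
      for _ in range(reps):
          avg = sum(random.randint(1, 6) for _ in range(n)) / n
          total += (avg - 3.5) ** 2
      return total / reps

  for n in (10, 100, 1000):
      # The estimate shrinks roughly like (35/12)/n.
      print(n, mean_sq_error(n))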

Formal definition
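
Let $X_1, X_2, \ldots$ be a sequence of stochastic variables, and let $r \geq 1$ be a fixed number (the "order"; written $r$ here rather than $n$ to avoid confusion with the sequence index).

If $\lim_{n \to \infty} E\left( | X_n - a |^r \right) = 0$ for some $a$, then the sequence has convergence in $r$th order mean to $a$. The case $r = 2$ is known as mean-square convergence.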

See also

Related topics

References

External links