Stochastic convergence
Revision as of 10:23, 28 June 2007
Stochastic convergence is a mathematical concept intended to formalize the idea that a sequence of essentially random or unpredictable events sometimes tends to settle into a pattern.
Four different varieties of stochastic convergence are noted:
- Almost sure convergence
- Convergence in probability
- Convergence in distribution
- Convergence in nth order mean
Almost sure convergence
Example
We may keep tossing a die an infinite number of times, noting the average outcome after every toss. The exact number obtained after each toss is unpredictable, but for a fair die it will tend to get closer and closer to the arithmetic average of 1, 2, 3, 4, 5 and 6, i.e. 3.5.
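This settling-down behaviour is easy to see numerically. The sketch below (the function name, toss count and seed are illustrative choices, not part of the text) simulates a fair die and records the running average after each toss:

```python
import random

def running_averages(n_tosses, seed=0):
    """Simulate n_tosses of a fair six-sided die and return the
    running average after each toss."""
    rng = random.Random(seed)  # fixed seed so the run is repeatable
    total = 0
    averages = []
    for i in range(1, n_tosses + 1):
        total += rng.randint(1, 6)  # one toss of a fair die
        averages.append(total / i)
    return averages

avgs = running_averages(100_000)
print(avgs[-1])  # close to the expected value 3.5
```

The early entries of the list fluctuate widely, while the later ones cluster ever more tightly around 3.5, which is the pattern the example describes.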
Formal definition
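A standard way to state this definition (the notation is the usual one and is assumed here): a sequence of random variables $X_1, X_2, \ldots$ converges almost surely to a random variable $X$ when the outcomes for which the sequence fails to converge form a set of probability zero:

```latex
\Pr\!\left( \lim_{n \to \infty} X_n = X \right) = 1
```

In the die example, $X_n$ is the running average after $n$ tosses and the limit $X$ is the constant 3.5.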
Convergence in probability
Example
Consider a short-lived animal of some species. We may note the exact amount of food the animal consumes day by day. This sequence of numbers will be unpredictable in advance, but we may be quite certain that one day the number will be zero, and stay zero forever after.
Formal definition
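The usual formal statement, with notation assumed as in the almost-sure case: $X_n$ converges in probability to $X$ if large deviations become arbitrarily unlikely, i.e. for every $\varepsilon > 0$,

```latex
\lim_{n \to \infty} \Pr\!\left( \left| X_n - X \right| > \varepsilon \right) = 0
```

Almost sure convergence implies convergence in probability, but not conversely.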
Convergence in distribution
Example
Formal definition
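A standard statement of this definition: writing $F_n$ and $F$ for the cumulative distribution functions of $X_n$ and $X$ respectively, the sequence converges in distribution to $X$ if

```latex
\lim_{n \to \infty} F_n(x) = F(x)
```

at every point $x$ where $F$ is continuous. This is the weakest of the four notions; the central limit theorem is the classic instance of it.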
Convergence in nth order mean
Example
Formal definition
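The usual statement, written here with the order denoted $r$ so it does not clash with the sequence index $n$: for $r \geq 1$, the sequence $X_n$ converges in $r$th order mean to $X$ if

```latex
\lim_{n \to \infty} \operatorname{E}\!\left[ \left| X_n - X \right|^{\,r} \right] = 0
```

The case $r = 2$ is the familiar mean-square convergence; convergence in $r$th order mean implies convergence in probability.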
See also
Related topics
- Probability
- Probability theory
- Differential equations
- Stochastic modeling