Stochastic convergence

From Citizendium

Revision as of 17:05, 28 June 2007

Stochastic convergence is a mathematical concept intended to formalize the idea that a sequence of essentially random or unpredictable events sometimes tends to settle into a pattern.

Four different varieties of stochastic convergence are noted:

  • Almost sure convergence
  • Convergence in probability
  • Convergence in distribution
  • Convergence in nth order mean


Almost sure convergence

Example

Consider a short-lived animal of some species. We may note the exact amount of food the animal consumes day by day. This sequence of numbers will be unpredictable in advance, but we may be quite certain that one day the number will be zero, and stay zero forever after.


Formal definition

Let X_0, X_1, ... be an infinite sequence of stochastic variables defined over a subset of R.

Then the actual outcomes will be an ordinary sequence of real numbers.

If the probability that this sequence will converge to a given real number a equals 1, then we say the original sequence of stochastic variables has almost sure convergence to a.

In more compact notation:

If P(lim_{i→∞} X_i = a) = 1 for some a, then the sequence has almost sure convergence to a.
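The short-lived-animal example can be simulated. In the minimal Python sketch below (the lifetime bound and the daily intake range are illustrative assumptions, not part of the original example), every simulated sample path eventually equals zero and stays there, which is exactly convergence with probability 1:

```python
import random

def food_intake_path(n_days=200, max_lifetime=100, seed=None):
    """One sample path: random daily food intake while the animal
    lives, then zero forever after (hypothetical toy model)."""
    rng = random.Random(seed)
    lifetime = rng.randint(1, max_lifetime)  # day on which intake drops to zero
    return [rng.uniform(0.5, 2.0) if day < lifetime else 0.0
            for day in range(n_days)]

# On every simulated outcome the sequence is eventually zero, so the
# realized paths converge to 0 -- almost sure convergence to a = 0.
paths = [food_intake_path(seed=s) for s in range(1000)]
all_eventually_zero = all(all(x == 0.0 for x in path[100:]) for path in paths)
print(all_eventually_zero)  # True: every path has settled at zero
```

Note that convergence here holds on every outcome, not merely with high probability at any fixed day: each realized sequence, viewed as an ordinary sequence of real numbers, converges to 0.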



Convergence in probability

Example

We may keep tossing a die an infinite number of times and at every toss note the average outcome so far. The exact number thus obtained after each toss will be unpredictable, but for a fair die it will tend to get closer and closer to the arithmetic average of 1, 2, 3, 4, 5, and 6, i.e. 3.5.


Formal definition

Let X_0, X_1, ... be an infinite sequence of stochastic variables defined over a subset of R.

If there exists a real number a such that lim_{i→∞} P(|X_i − a| > ε) = 0 for all ε > 0, then the sequence has convergence in probability to a.
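The definition can be checked empirically for the die example. A small Python sketch (the tolerance eps, sample sizes, and trial count are arbitrary choices) estimates P(|X_n − 3.5| > ε) by Monte Carlo for increasing n; the estimated probabilities shrink toward zero, as the definition requires:

```python
import random

rng = random.Random(0)

def running_average(n_tosses):
    """Average of the first n_tosses of a fair six-sided die."""
    return sum(rng.randint(1, 6) for _ in range(n_tosses)) / n_tosses

def prob_far(n, eps=0.25, trials=2000):
    """Monte Carlo estimate of P(|X_n - 3.5| > eps)."""
    return sum(abs(running_average(n) - 3.5) > eps
               for _ in range(trials)) / trials

estimates = [prob_far(n) for n in (10, 100, 1000)]
print(estimates)  # the estimated probabilities decrease toward 0 as n grows
```

This is the weak law of large numbers in action: the running average converges in probability to the expected value 3.5.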


Convergence in distribution

Example

Formal definition

Convergence in nth order mean

Example

Formal definition

Relations between the different modes of convergence

  • If a stochastic sequence has almost sure convergence, then it also has convergence in probability.
  • If a stochastic sequence has convergence in probability, then it also has convergence in distribution.
  • If a stochastic sequence has convergence in (n+1)th order mean, then it also has convergence in nth order mean (n>0).
  • If a stochastic sequence has convergence in nth order mean, then it also has convergence in probability.
