Convergence of random variables
Suppose we have an infinite sequence of random variables
1 Different notions of convergence
Recall that a sequence
Convergence of a sequence of random variables is a more subtle notion. We have already seen one form of convergence: convergence in distribution.

There are two main ways in which the notion of convergence of a real sequence can be extended to sequences of random variables:
Convergence in probability. A sequence $\{X_n\}_{n \ge 1}$ of random variables converges to a random variable $X$ in probability if, for every $\varepsilon > 0$,
$$\lim_{n \to \infty} \Pr(|X_n - X| > \varepsilon) = 0.$$
Or, equivalently, for any $\varepsilon > 0$ and $\delta > 0$, there exists an $N$ such that for all $n \ge N$, we have $\Pr(|X_n - X| > \varepsilon) < \delta$. We denote such convergence as $X_n \xrightarrow{p} X$.

Almost sure convergence. A sequence $\{X_n\}_{n \ge 1}$ of random variables converges to a random variable $X$ almost surely if
$$\Pr\Bigl(\lim_{n \to \infty} X_n = X\Bigr) = 1.$$
Or, equivalently, for any $\varepsilon > 0$,
$$\lim_{N \to \infty} \Pr\Bigl(\sup_{n \ge N} |X_n - X| > \varepsilon\Bigr) = 0.$$
We denote such convergence as $X_n \xrightarrow{a.s.} X$.
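As a quick sanity check, the definition of convergence in probability can be illustrated by simulation. The sketch below is a toy example of our own choosing, not one from these notes: $X_n \sim \text{Bernoulli}(1/n)$, so that $\Pr(|X_n - 0| > \varepsilon) = 1/n$ for any $0 < \varepsilon < 1$, and $X_n \xrightarrow{p} 0$.

```python
import numpy as np

# Toy example (not from the notes): X_n ~ Bernoulli(1/n), so for any
# 0 < eps < 1 we have P(|X_n - 0| > eps) = 1/n -> 0, i.e. X_n -> 0
# in probability.
rng = np.random.default_rng(0)
eps = 0.5
trials = 100_000

for n in [10, 100, 1000, 10000]:
    x_n = (rng.random(trials) < 1.0 / n).astype(float)  # samples of X_n
    est = np.mean(np.abs(x_n) > eps)  # Monte Carlo estimate of P(|X_n| > eps)
    print(f"n = {n:6d}   P(|X_n| > {eps}) ~ {est:.4f}   (exact: {1 / n:.4f})")
```

The estimated tail probabilities shrink like $1/n$, matching the definition with $\varepsilon = 0.5$.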
Explanation adapted from math.stackexchange [1] and [2].
Limits of sets are easy to describe when we have a weakly increasing or weakly decreasing sequence of sets. In particular, if $A_1 \subseteq A_2 \subseteq \cdots$, then $\lim_{n} A_n = \bigcup_{n=1}^{\infty} A_n$; and if $A_1 \supseteq A_2 \supseteq \cdots$, then $\lim_{n} A_n = \bigcap_{n=1}^{\infty} A_n$.

What happens when a sequence $\{A_n\}$ is not monotone? We can still define two limiting sets.

The limit superior of $\{A_n\}$ is
$$\limsup_{n} A_n = \bigcap_{N=1}^{\infty} \bigcup_{n=N}^{\infty} A_n,
\qquad
\liminf_{n} A_n = \bigcup_{N=1}^{\infty} \bigcap_{n=N}^{\infty} A_n.$$

Another way to think about these definitions is as follows: $\omega \in \limsup_n A_n$ if and only if $\omega$ belongs to $A_n$ for infinitely many $n$ (the events $A_n$ occur "infinitely often"). Similarly, $\omega \in \liminf_n A_n$ if and only if $\omega$ belongs to $A_n$ for all but finitely many $n$ (the events $A_n$ occur "eventually").
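The "infinitely often" and "all but finitely often" readings of limsup and liminf can be checked mechanically on a long finite prefix of a sequence of sets. Below is a small sketch of our own (not from the notes); the truncation index `N_max` is kept small enough that every tail slice is still representative of the infinite tail.

```python
from functools import reduce

def tail_limsup(sets, N_max):
    """Approximate limsup A_n = intersection over N of union_{n>=N} A_n,
    using N = 0..N_max-1 over a finite prefix of the sequence."""
    return reduce(set.intersection,
                  (reduce(set.union, sets[N:]) for N in range(N_max)))

def tail_liminf(sets, N_max):
    """Approximate liminf A_n = union over N of intersection_{n>=N} A_n."""
    return reduce(set.union,
                  (reduce(set.intersection, sets[N:]) for N in range(N_max)))

# A_n alternates between {0} and {0, 1}: 0 lies in every A_n, while 1 lies
# in infinitely many A_n but not in all-but-finitely-many of them.
A = [{0} if n % 2 == 0 else {0, 1} for n in range(40)]
print(tail_limsup(A, 20))  # {0, 1}: elements occurring infinitely often
print(tail_liminf(A, 20))  # {0}: elements occurring eventually
```

Here liminf is a strict subset of limsup, exactly because the element 1 keeps recurring without ever being in the sets permanently.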
2 Examples of convergence in probability
Example 1 Consider a probability space
Note that
Example 2 Consider the same setup as Example 1, but the random variable
This can be solved in exactly the same manner as Example 1.
Example 3 Consider the same setup as Example 1, but the random variable
This can be solved in exactly the same manner as Example 1.
Example 4 Consider an i.i.d. sequence
Pick any
3 Examples of almost sure convergence
We revisit the examples of previous section.
In Example 1, for any , the sequence is a finite sequence of ’s followed by an infinite sequence of ’s. Thus, . Hence, .

In Example 2, the same argument as above works.
In Example 3, a slight variation of the above argument works.
In Example 4, we proceed as follows. Fix and consider the sequence . Since this is a decreasing sequence, it must have a limit. Denote that limit by , i.e.,

Since is a decreasing sequence, we have that . Hence, for any , (where the last inequality follows from the calculation in the solution of Example 4).

The above inequality holds for every , so we have . Recall that was arbitrary. Therefore, we have shown that . Thus, the only possibility is that . Hence, .
4 Almost sure convergence from convergence in probability
It is possible to infer almost sure convergence from convergence in probability. For that, we first need a preliminary result. The proof is not difficult, but is omitted in the interest of time.
Lemma 1 (Borel-Cantelli Lemma) Let $\{A_n\}_{n \ge 1}$ be a sequence of events such that $\sum_{n=1}^{\infty} \Pr(A_n) < \infty$. Then
$$\Pr\bigl(\limsup_{n} A_n\bigr) = \Pr(A_n \text{ occur infinitely often}) = 0.$$
There is a partial converse of the Borel-Cantelli lemma.
Lemma 2 (Second Borel-Cantelli Lemma) Let $\{A_n\}_{n \ge 1}$ be a sequence of independent events such that $\sum_{n=1}^{\infty} \Pr(A_n) = \infty$. Then
$$\Pr\bigl(\limsup_{n} A_n\bigr) = \Pr(A_n \text{ occur infinitely often}) = 1.$$
An immediate implication of the Borel-Cantelli lemma is the following:

Lemma 3 Suppose that for every $\varepsilon > 0$,
$$\sum_{n=1}^{\infty} \Pr(|X_n - X| > \varepsilon) < \infty.$$
Then $X_n \xrightarrow{a.s.} X$.
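Lemma 3 reduces almost sure convergence to a summability check on tail probabilities. The sketch below (with tail-probability sequences chosen purely for illustration, not taken from the notes) contrasts a summable sequence, $p_n = 1/n^2$, with the divergent harmonic sequence $p_n = 1/n$.

```python
import math

# If p_n = P(|X_n - X| > eps), Lemma 3 applies when sum_n p_n < infinity.
# p_n = 1/n^2 is summable (partial sums approach pi^2/6 ~ 1.6449),
# while p_n = 1/n diverges like log N.
for N in [10**2, 10**4, 10**6]:
    s_sq = sum(1.0 / n**2 for n in range(1, N + 1))
    s_har = sum(1.0 / n for n in range(1, N + 1))
    print(f"N = {N:>7}   sum 1/n^2 = {s_sq:.6f}   sum 1/n = {s_har:.2f}")
print("pi^2/6 =", math.pi**2 / 6)
```

With tail probabilities $1/n^2$, Lemma 3 gives almost sure convergence; with $1/n$, the lemma is silent, and (as the next examples show) the conclusion can genuinely fail.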
In light of the above result, we revisit some variations of the examples of the previous section.
Consider a variation of Example 1 where we no longer specify as a function of but simply assume that . Then, for any , . Therefore, ; hence, by Lemma 3, .

Consider a variation of Example 2 where we no longer specify as a function of but simply assume that and are independent. Then, for any , . Therefore, ; hence, by the second Borel-Cantelli lemma, . So does not converge almost surely!
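The role of independence here can be seen in simulation. Assuming independent events $A_n$ with $\Pr(A_n) = 1/n$ (a standard illustration of our own, not an example from the notes), $\sum_n \Pr(A_n) = \infty$, so the second Borel-Cantelli lemma says infinitely many $A_n$ occur almost surely; the running count of occurrences keeps growing like $H_n \approx \log n$.

```python
import numpy as np

# Independent events A_n with P(A_n) = 1/n: sum_n P(A_n) diverges, so by
# the second Borel-Cantelli lemma infinitely many A_n occur almost surely.
# The expected number of occurrences up to n is the harmonic number H_n.
rng = np.random.default_rng(1)
N = 100_000
hits = rng.random(N) < 1.0 / np.arange(1, N + 1)  # hits[k] = 1{A_{k+1} occurs}
for stop in [100, 1000, 10_000, 100_000]:
    count = int(hits[:stop].sum())
    expected = np.log(stop) + 0.5772  # H_n ~ log n + Euler's constant
    print(f"occurrences up to n = {stop:>6}: {count:3d}   (E ~ {expected:.1f})")
```

The count never levels off, which is the simulated shadow of "the events occur infinitely often".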
A variation of Lemma 3 is the following:
Lemma 4 Let
To simplify the notation, we assume that
Pick any
From Markov’s inequality, we have
5 Some properties of convergence of sequences of random variables
We now state some properties without proof.
- The three notions of convergence that we have defined are related as follows: almost sure convergence implies convergence in probability, which in turn implies convergence in distribution,
$$X_n \xrightarrow{a.s.} X \implies X_n \xrightarrow{p} X \implies X_n \xrightarrow{d} X.$$
To see the first implication: Fix $\varepsilon > 0$. Let $B_N = \{\sup_{n \ge N} |X_n - X| > \varepsilon\}$; almost sure convergence means $\Pr(B_N) \to 0$. Thus, since $\Pr(|X_N - X| > \varepsilon) \le \Pr(B_N)$, we get $\Pr(|X_N - X| > \varepsilon) \to 0$, i.e., convergence in probability.
There are partial converses. For any constant $c$, if $X_n \xrightarrow{d} c$, then $X_n \xrightarrow{p} c$.
- If $\{X_n\}$ is a strictly decreasing sequence and $X_n \xrightarrow{p} X$, then $X_n \xrightarrow{a.s.} X$.
- If $X_n \xrightarrow{p} X$, then there exists a subsequence $\{X_{n_k}\}$ such that $X_{n_k}$ converges almost surely to $X$. Moreover, $X_n \xrightarrow{p} X$ if and only if every subsequence of $\{X_n\}$ has a sub-subsequence that converges to $X$ almost surely.
- Skorokhod’s representation theorem. If $X_n \xrightarrow{d} X$, then there exists a sequence $\{Y_n\}$, with each $Y_n$ identically distributed to $X_n$, such that $Y_n \xrightarrow{a.s.} Y$, where $Y$ is identically distributed to $X$.
- Continuous mapping theorems. Let $g$ be a continuous function. Then, $X_n \xrightarrow{a.s.} X$ implies $g(X_n) \xrightarrow{a.s.} g(X)$; $X_n \xrightarrow{p} X$ implies $g(X_n) \xrightarrow{p} g(X)$; and $X_n \xrightarrow{d} X$ implies $g(X_n) \xrightarrow{d} g(X)$.
Convergence of sums.
- If $X_n \xrightarrow{p} X$ and $Y_n \xrightarrow{p} Y$, then $X_n + Y_n \xrightarrow{p} X + Y$.
- If $X_n \xrightarrow{a.s.} X$ and $Y_n \xrightarrow{a.s.} Y$, then $X_n + Y_n \xrightarrow{a.s.} X + Y$.
- It is not true in general that $X_n + Y_n \xrightarrow{d} X + Y$ whenever $X_n \xrightarrow{d} X$ and $Y_n \xrightarrow{d} Y$. The result is true when $X$ or $Y$ is a constant.
- If $X_n \xrightarrow{d} X$ and $Y_n \xrightarrow{p} c$ for a constant $c$, then $X_n + Y_n \xrightarrow{d} X + c$ and $X_n Y_n \xrightarrow{d} cX$ (Slutsky’s theorem).
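The failure of convergence of sums under convergence in distribution can be seen concretely. A minimal counterexample of our own construction: with $Z \sim N(0,1)$, set $X_n = Z$ and $Y_n = -Z$ for all $n$. Both converge in distribution to a standard normal, but $X_n + Y_n = 0$ identically, not the $N(0, 2)$ limit one would get from independent summands.

```python
import numpy as np

# X_n = Z and Y_n = -Z each have a standard normal distribution, so both
# trivially converge in distribution to N(0,1). Their sum, however, is
# identically 0: distributional convergence says nothing about the joint
# behavior of the two sequences.
rng = np.random.default_rng(2)
z = rng.standard_normal(100_000)
x, y = z, -z
print("Var(X_n) ~", x.var())            # close to 1: marginal is N(0, 1)
print("Var(Y_n) ~", y.var())            # close to 1: marginal is N(0, 1)
print("Var(X_n + Y_n) =", (x + y).var())  # exactly 0.0, not 2
```

This is precisely why the distributional statement needs extra structure, such as one limit being a constant.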
6 Strong law of large numbers
Theorem 1 (Strong Law of Large Numbers) Let $\{X_n\}_{n \ge 1}$ be a sequence of i.i.d. random variables with finite mean $\mu = E[X_1]$. Then
$$\frac{1}{n} \sum_{i=1}^{n} X_i \xrightarrow{a.s.} \mu.$$
We provide a proof under the additional assumption that the fourth moments exist.

We assume that $\mu = 0$; otherwise, replace $X_i$ by $X_i - \mu$. Let $S_n = \sum_{i=1}^{n} X_i$.

Since we know that the fourth moment exists, we can use a fourth-moment version of Chebyshev’s inequality:
$$\Pr\Bigl(\Bigl|\frac{S_n}{n}\Bigr| > \varepsilon\Bigr) \le \frac{E[S_n^4]}{n^4 \varepsilon^4}.$$

Then, by the multinomial theorem, we have
$$E[S_n^4] = \sum_{i} E[X_i^4] + 3 \sum_{i \ne j} E[X_i^2]\, E[X_j^2] = n\, E[X_1^4] + 3 n (n - 1) \bigl(E[X_1^2]\bigr)^2.$$

Since the $X_i$ are independent with $E[X_i] = 0$, all the remaining cross terms vanish; for example, $E[X_i^3 X_j] = E[X_i^3]\, E[X_j] = 0$ and $E[X_i^2 X_j X_k] = E[X_i^2]\, E[X_j]\, E[X_k] = 0$.

Therefore, $E[S_n^4] \le C n^2$ for some constant $C$, so for every $\varepsilon > 0$,
$$\sum_{n=1}^{\infty} \Pr\Bigl(\Bigl|\frac{S_n}{n}\Bigr| > \varepsilon\Bigr) \le \sum_{n=1}^{\infty} \frac{C n^2}{n^4 \varepsilon^4} < \infty.$$
Hence, by Lemma 3, $S_n / n \xrightarrow{a.s.} 0$.
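The strong law is easy to see in simulation. A minimal sketch (Uniform(0,1) samples with mean $1/2$, chosen purely for illustration) tracks the running sample mean along a single sample path.

```python
import numpy as np

# One sample path of X_1, X_2, ... with X_i ~ Uniform(0, 1), mean 1/2.
# The strong law says the running sample mean settles at 1/2 along
# almost every such path.
rng = np.random.default_rng(3)
x = rng.random(1_000_000)
running_mean = np.cumsum(x) / np.arange(1, len(x) + 1)
for n in [10, 1000, 100_000, 1_000_000]:
    print(f"n = {n:>8}: sample mean = {running_mean[n - 1]:.5f}")
```

Note the distinction from a mere in-probability statement: the entire tail of this one path stabilizes near $1/2$, not just the marginal distribution at each fixed $n$.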
7 There’s more
There’s another type of convergence commonly used in engineering: convergence in the mean-square sense. In the interest of time, we will not study this notion in class.