The main goal of this session is to prove a version of the strong law of large numbers, assuming an extra finite fourth moment condition.

1 Convergence

Let \(X_1, X_2, \ldots\) be random variables on a probability space \((\Omega, {\mathcal F}, {\mathbb P})\). We say that \(X_n\) converges to a random variable \(X\) if for all \(\omega \in \Omega\), we have that \(X_n(\omega) \to X(\omega)\). We say that \(X_n\) converges almost surely to \(X\) if there exists an event \(\Omega' \in {\mathcal F}\) with \({\mathbb P}(\Omega') = 1\) such that for all \(\omega \in \Omega'\), we have \(X_n(\omega) \to X(\omega)\). We say that \(X_n\) converges in probability to \(X\) if for all \(\varepsilon>0\), we have \[\lim_{n \to \infty} {\mathbb P}( \left\{ {\omega \in \Omega: |X_n(\omega) - X(\omega)| >\varepsilon} \right\} ) = 0.\] If \({\mathbb E}|X_n|^2 < \infty\) and \({\mathbb E}|X|^2 < \infty\), and \({\mathbb E}|X_n-X|^2 \to 0\) as \(n \to \infty\), we say that \(X_n\) converges in \(L^2\), or in mean square, to \(X\).
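These modes of convergence can be probed numerically. Below is a minimal simulation sketch (the use of numpy, the Uniform(0,1) distribution, and the sample sizes are all illustrative assumptions, not part of these notes): it estimates \({\mathbb P}(|X_n - X| > \varepsilon)\) for \(X_n\) the mean of \(n\) uniform draws and \(X = 1/2\), and the estimates shrink as \(n\) grows, as convergence in probability predicts.

```python
import numpy as np

rng = np.random.default_rng(0)

# Empirical tail probabilities P(|X_n - 1/2| > eps), where X_n is the
# mean of n Uniform(0,1) draws; these should decrease as n grows.
eps, trials = 0.05, 100_000
for n in (10, 100, 1_000):
    xn = rng.random((trials, n)).mean(axis=1)
    print(n, (np.abs(xn - 0.5) > eps).mean())
```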

We previously proved, using Markov’s inequality, that \(L^2\) convergence implies convergence in probability. It turns out that almost-sure convergence also implies convergence in probability. We have not yet given a proof of any version of the strong law of large numbers:

Theorem 1.1 (Law of large numbers (almost-sure version)) Let \((X_n)_{n \in \mathbb{Z}^{+}}\) be a sequence of independent and identically distributed (i.i.d.) random variables. If \(\mathbb{E} |X_1| < \infty\), then \[ n^{-1}(X_1 + \cdots + X_n) \to \mathbb{E} X_1\] almost surely.
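To build intuition for the theorem, here is a minimal simulation sketch (the Exponential(1) distribution and the sample sizes are arbitrary illustrative choices): the running means of i.i.d. Exponential(1) variables settle near \(\mathbb{E} X_1 = 1\).

```python
import numpy as np

rng = np.random.default_rng(1)

# One long i.i.d. sample; the running means n^{-1}(X_1 + ... + X_n)
# should settle near E X_1 = 1 for Exponential(1) variables.
x = rng.exponential(scale=1.0, size=100_000)
running_means = np.cumsum(x) / np.arange(1, x.size + 1)
for n in (10, 100, 1_000, 100_000):
    print(n, running_means[n - 1])
```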

2 The Borel-Cantelli lemma I

Let \((A_n)_{n=1}^{\infty}\) be events in a probability space \((\Omega, {\mathcal F}, {\mathbb P})\). We define

\[ \limsup_{n \to \infty} A_n := \bigcap_{n=1} ^ {\infty} \bigcup_{m=n}^{\infty} A_m.\]

Note that \(\limsup_{n \to \infty} A_n\) is an event, also given by \[ \left\{ {\omega \in \Omega: \omega \text{ belongs to infinitely many of the } A_n} \right\} ;\] thus sometimes we write \(\limsup_{n \to \infty} A_n = \left\{ {A_n \text{ i.o.}} \right\} \).

Theorem 2.1 (Borel-Cantelli Lemma) Let \((A_n)_{n=1}^{\infty}\) be events in a probability space \((\Omega, {\mathcal F}, {\mathbb P})\). If \(\sum_{n=1}^{\infty} {\mathbb P}(A_n) < \infty\), then \({\mathbb P}(A_n \text{ i.o.}) = 0\).


Proof. Note that for every \(k\) we have \[{\mathbb P}(A_n \text{ i.o.}) \leq {\mathbb P}\big(\bigcup_{m=k}^{\infty} A_m\big) \leq \sum_{m=k}^{\infty} {\mathbb P}(A_m),\] where the first inequality holds because \(\limsup_{n} A_n \subseteq \bigcup_{m=k}^{\infty} A_m\), and the second is countable subadditivity. Since the series \(\sum_n {\mathbb P}(A_n)\) converges, its tails tend to zero, so taking \(k \to \infty\) gives the desired result.

Notice that the Borel-Cantelli lemma does not require any independence assumption. Later, we will discuss a partial converse that does require independence.
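The lemma can also be illustrated empirically. In the sketch below (independent events and \({\mathbb P}(A_n) = n^{-2}\) are illustrative choices; the lemma itself needs no independence), the summability of \(\sum_n n^{-2}\) means each simulated run should see only finitely many of the \(A_n\) occur, and indeed the counts stay small.

```python
import numpy as np

rng = np.random.default_rng(2)

# Independent events A_n with P(A_n) = 1/n^2, so sum_n P(A_n) < infinity.
# Borel-Cantelli I predicts each run sees only finitely many A_n occur.
N = 100_000
n = np.arange(1, N + 1, dtype=float)
occurred = rng.random((20, N)) < 1.0 / n**2  # 20 independent runs
print(occurred.sum(axis=1))  # total occurrences per run: small numbers
```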

The Borel-Cantelli lemma is useful for proving almost-sure convergence because of the following fact.

Exercise 2.1 (Convergence) Show that \(X_n\) converges almost surely to \(X\) if and only if the events given by \(A_{n, \varepsilon} = \left\{ {\omega \in \Omega: |X_n(\omega) - X(\omega)| > \varepsilon} \right\} \) are such that \({\mathbb P}(A_{n, \varepsilon} \text{ i.o.})=0\) for all \(\varepsilon>0\).

3 A version of the strong law

We prove the following version of the strong law of large numbers under the extra assumption of finite fourth moments.

Theorem 3.1 (Strong law with fourth moments) Let \((X_i)_{i=1}^{\infty}\) be i.i.d. random variables with \({\mathbb E}X_1 =0\) and \({\mathbb E}X_1^4 < \infty\). If \(S_n = X_1 + \cdots + X_n\), then \(S_n/n\) converges almost surely to \(0\).


The assumption that \({\mathbb E}X_1 = 0\) can easily be removed by considering \(Y_i = X_i - {\mathbb E}X_i\). We will use the Borel-Cantelli lemma and a slightly more general version of Markov’s inequality.

Lemma 3.1 (Markov’s inequality) Let \(a \geq 0\). Then for any increasing function \(g: [0, \infty) \to [0, \infty)\), we have \[g(a) {\mathbb P}(|X| \geq a) \leq {\mathbb E}g(|X|).\]
Proof. Since \(g\) is increasing, \[ g(a) \mathbf{1}[ |X| \geq a] \leq g(|X|).\] Taking expectations on both sides, we obtain the required result.
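As a numerical sanity check of the lemma (an illustration, not part of the notes), one can compare both sides for a standard normal \(X\) and the choice \(g(x) = x^4\) that we use below:

```python
import numpy as np

rng = np.random.default_rng(3)

# Check g(a) P(|X| >= a) <= E g(|X|) with g(x) = x^4, X standard normal.
x = rng.standard_normal(1_000_000)
a = 2.0
lhs = a**4 * (np.abs(x) >= a).mean()   # roughly 16 * 0.0455 ~ 0.73
rhs = (np.abs(x)**4).mean()            # roughly E X^4 = 3
print(lhs, rhs)
```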


Proof (Strong law with fourth moments). Observe that
\[{\mathbb E}S_n^4 = \sum_{i,j,k,\ell} {\mathbb E}( X_i X_j X_k X_{\ell}).\] By independence and the assumption \({\mathbb E}X_1 = 0\), any term in which some index appears exactly once vanishes; for instance, \({\mathbb E}(X_i^3 X_j) = {\mathbb E}X_i^3 \, {\mathbb E}X_j = 0\) for \(i \neq j\). The only non-zero terms are of the form \({\mathbb E}(X_i^4)\) and \({\mathbb E}(X_i^2 X_j^2) = {\mathbb E}X_i^2 \, {\mathbb E}X_j^2\) where \(i \neq j\). There are \(n\) terms of the first kind and \(3n(n-1)\) of the second, since each of the \(n(n-1)/2\) unordered pairs \(\{i,j\}\) can occupy the four slots in \(\binom{4}{2} = 6\) ways. Thus \[ {\mathbb E}S_n^4 = n{\mathbb E}X_1^4 +3n(n-1) {\mathbb E}X_1^2 \, {\mathbb E}X_2 ^2.\]

Markov’s inequality with \(g(x) = x^4\) gives

\[ \varepsilon^4 n^4 {\mathbb P}(|S_n|/n >\varepsilon) \leq {\mathbb E}S_n^4 \leq C n^2\] for some constant \(C > 0\) depending only on \({\mathbb E}X_1^2\) and \({\mathbb E}X_1^4\). Hence \({\mathbb P}(|S_n|/n > \varepsilon) \leq C \varepsilon^{-4} n^{-2}\), which is summable in \(n\); thus the Borel-Cantelli lemma together with Exercise 2.1 yields the desired result.
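The moment identity driving the proof can be sanity-checked by simulation (standard normal summands, for which \({\mathbb E}X_1^2 = 1\) and \({\mathbb E}X_1^4 = 3\), are an illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(4)

# Monte Carlo check of E S_n^4 = n E X^4 + 3 n (n-1) (E X^2)^2 for
# centred i.i.d. X; with standard normals and n = 10 this equals 300.
n, trials = 10, 1_000_000
s = rng.standard_normal((trials, n)).sum(axis=1)
print((s**4).mean())               # empirical E S_n^4
print(n * 3 + 3 * n * (n - 1))     # predicted value: 300
```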

3.1 Borel-Cantelli II

The Borel-Cantelli lemma has the following converse in the case where we do assume independence of the events.

Theorem 3.2 (Borel-Cantelli (second form)) Let \(A_1, A_2, \ldots\) be a sequence of independent events with \[\begin{equation} \sum_{i=1} ^{\infty} {\mathbb P}(A_i) = \infty. \end{equation}\] Then \({\mathbb P}(A_n \ i.o.) = 1\).


Proof. By De Morgan’s laws, it suffices to show that \[\begin{equation} {\mathbb P}\big( \bigcup_{m=1} ^{\infty} \bigcap _{n=m} ^{\infty} A_n^c \big) =0. \end{equation}\] We will show that for each \(m \geq 1\), we have \[ \lim_{N \to \infty} {\mathbb P}\big( \bigcap_{n=m} ^{N} A_n^c \big) = 0,\] from which the desired result follows by the continuity of the probability measure and countable subadditivity.

The independence assumption gives \[\begin{eqnarray*} {\mathbb P}\big( \bigcap_{n=m} ^{N} A_n^c \big) &=& \prod_{n=m} ^N {\mathbb P}(A_n^c) \\ &=& \prod_{n=m} ^N (1- {\mathbb P}(A_n)) \\ &\leq& \prod_{n=m} ^N e^{-{\mathbb P}(A_n)} \\ &=& \exp \big( -\sum_{n=m} ^N {\mathbb P}(A_n) \big), \end{eqnarray*}\] where we used the inequality \(1 - x \leq e^{-x}\), valid for all \(x \geq 0\). By the divergence assumption, the last expression goes to \(0\) as \(N \to \infty\).
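The divergent case can also be probed numerically (independent simulated events with \({\mathbb P}(A_n) = 1/n\) are illustrative choices): the running count of occurrences keeps growing, roughly like \(\log N\), consistent with infinitely many \(A_n\) occurring.

```python
import numpy as np

rng = np.random.default_rng(5)

# Independent events A_n with P(A_n) = 1/n, so sum_n P(A_n) diverges.
# Borel-Cantelli II predicts infinitely many A_n occur almost surely;
# the expected count up to N is the harmonic number H_N ~ log N.
N = 1_000_000
n = np.arange(1, N + 1, dtype=float)
counts = np.cumsum(rng.random(N) < 1.0 / n)
for k in (100, 10_000, 1_000_000):
    print(k, counts[k - 1], np.log(k))
```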


We have already seen how powerful the first Borel-Cantelli lemma is: it let us prove an almost-sure strong law! Typical textbook applications of the Borel-Cantelli lemmas include exercises like the following.

Exercise 3.1 Let \((X_i)_{i=3} ^{\infty}\) be an i.i.d. sequence of standard normal variables. Prove that almost surely \[\limsup_{n \to \infty} \frac{X_n}{\sqrt{2 \log n}} =1.\] Hint: recall that for \(t >0\), we have \[ \frac{1}{\sqrt{2\pi}} \big(\frac{1}{t} - \frac{1}{t^3}\big) e^{-\tfrac{t^2}{2}} \leq {\mathbb P}(X_3 >t)\leq \frac{1}{t\sqrt{2\pi}} e^{-\tfrac{t^2}{2}}.\]


Solution. We will apply the Borel-Cantelli lemma in its first form to show that \[\limsup_{n \to \infty} \frac{X_n}{\sqrt{2 \log n}} \leq 1\] and use its second form (where we will use the assumed independence) to show that \[ \limsup_{n \to \infty} \frac{X_n}{\sqrt{2 \log n}} \geq 1.\] We will also employ the upper and lower bounds on the tail of a standard normal given by the hint.

Let \(\varepsilon>0\). We have \[\begin{eqnarray*} {\mathbb P}\Big ( \frac{X_n}{\sqrt{2 \log n}} > 1 + \varepsilon\Big ) &=& {\mathbb P}\Big ( X_n > (1 + \varepsilon) \sqrt{2\log n} \Big ) \\ &\leq & \frac{1}{ (1+\varepsilon) \sqrt{2 \log n} \sqrt{2\pi}} e^{-(1+\varepsilon)^2 \log n} \\ &\leq & \frac{1}{n^{(1+ \varepsilon)^2 } } \\ &=:& a_n \end{eqnarray*}\] for all \(n\) large enough. Since \((1+\varepsilon)^2 > 1\), \(a_n\) is summable, so the upper bound follows from the first form of the Borel-Cantelli lemma.

For the lower bound, note that \(\frac{1}{t} - \frac{1}{t^3} > \frac{1}{2t}\) for \(t \geq 2\). Observe that (even without an \(\varepsilon\)), \[{\mathbb P}\Big ( \frac{X_n}{\sqrt{2 \log n}} > 1\Big) \geq \frac{1}{2\sqrt{2\pi} \sqrt{2 \log n}}e^{- \log n} > \frac{c}{n \log n}=:b_n\] for some \(c >0\). We know from calculus that the partial sums of \(b_n\) diverge to \(\infty\), and the events \(\{X_n > \sqrt{2 \log n}\}\) are independent, so the lower bound follows from the second form of the Borel-Cantelli lemma.
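For a heuristic check of the exercise (this simulation is an illustration, not a proof), one can track the running maximum of simulated standard normals: since \(\limsup_n X_n/\sqrt{2\log n} = 1\) almost surely, the ratio \(\max_{k \leq n} X_k / \sqrt{2 \log n}\) should hover near \(1\), although the convergence is quite slow.

```python
import numpy as np

rng = np.random.default_rng(6)

# Running maximum of i.i.d. standard normals, compared with the
# predicted scale sqrt(2 log n); the ratio drifts slowly toward 1.
N = 1_000_000
running_max = np.maximum.accumulate(rng.standard_normal(N))
for n in (1_000, 100_000, 1_000_000):
    print(n, running_max[n - 1] / np.sqrt(2 * np.log(n)))
```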

4 Summary