The main goal of this session is to prove a version of the strong law of large numbers, assuming an *extra* finite fourth moment condition.

Let \(X_1, X_2, \ldots\) be random variables on a probability space \((\Omega, {\mathcal F}, {\mathbb P})\). We say that \(X_n\) converges to a random variable \(X\) if for all \(\omega \in \Omega\), we have that \(X_n(\omega) \to X(\omega)\). We say that \(X_n\) **converges almost surely** to \(X\) if there exists an event \(\Omega' \in {\mathcal F}\) with \({\mathbb P}(\Omega') =1\) such that for all \(\omega \in \Omega'\), we have \(X_n(\omega) \to X(\omega)\). We say that \(X_n\) **converges in probability** to \(X\) if for all \(\varepsilon>0\), we have \[\lim_{n \to \infty} {\mathbb P}( \left\{ {\omega \in \Omega: |X_n(\omega) - X(\omega)| >\varepsilon} \right\} ) = 0.\] If \({\mathbb E}|X_n|^2 < \infty\) and \({\mathbb E}|X|^2 < \infty\), and \({\mathbb E}|X_n-X|^2 \to 0\) as \(n \to \infty\), we say that \(X_n\) **converges in \(L^2\)** or **in mean square** to \(X\).
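As a quick numerical illustration (a Python sketch, not part of the original notes; the Uniform\([0,1]\) choice and the parameters below are my own), take \(X_n\) to be the sample mean of \(n\) i.i.d. Uniform\([0,1]\) variables and \(X \equiv 1/2\); Monte Carlo estimates of \({\mathbb P}(|X_n - X| > \varepsilon)\) shrink toward zero as \(n\) grows, which is exactly convergence in probability. Assumes `numpy` is available.

```python
import numpy as np

# Sketch: estimate P(|X_n - 1/2| > eps) by Monte Carlo, where X_n is the
# sample mean of n i.i.d. Uniform[0, 1] variables, so X_n -> 1/2 in
# probability (by the weak law of large numbers).
rng = np.random.default_rng(0)
eps, trials = 0.05, 20_000

def tail_prob(n: int) -> float:
    """Monte Carlo estimate of P(|mean of n uniforms - 1/2| > eps)."""
    means = rng.random((trials, n)).mean(axis=1)
    return float(np.mean(np.abs(means - 0.5) > eps))

for n in [10, 100, 1000]:
    print(n, tail_prob(n))
```

The printed probabilities decrease with \(n\), consistent with the definition above.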

We previously proved using Markov’s inequality that \(L^2\) convergence implies convergence in probability. It turns out that almost-sure convergence also implies convergence in probability. We have not yet given a proof of *any* version of the strong law of large numbers:

Let \((A_n)_{n=1}^{\infty}\) be events in a probability space \((\Omega, {\mathcal F}, {\mathbb P})\). We define

\[ \limsup_{n \to \infty} A_n := \bigcap_{n=1} ^ {\infty} \bigcup_{m=n}^{\infty} A_m.\]

Note that \(\limsup_{n \to \infty} A_n\) is an event also given by \[ \left\{ {\omega \in \Omega: \omega \text{ belongs to infinitely many of the } A_n} \right\} ;\] thus sometimes we write \(\limsup_{n \to \infty} A_n = \left\{ {A_n \text{ i.o.}} \right\} \).
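To see the definition in action, here is a small finite sketch (the family \(A_n\) below is hypothetical, chosen purely for illustration): for the alternating family \(A_n = E\) for even \(n\) and \(A_n = F\) for odd \(n\), every point of \(E \cup F\) lies in infinitely many of the \(A_n\), so \(\limsup_n A_n = E \cup F\). A finite truncation of \(\bigcap_n \bigcup_{m \geq n} A_m\) already stabilizes to this set.

```python
from functools import reduce

# Hypothetical alternating family: A_n = E for even n, F for odd n.
# Every omega in E ∪ F belongs to infinitely many A_n, so limsup A_n = E ∪ F.
E, F = {0, 1}, {1, 2}
N = 50  # truncation level; the computation stabilizes long before this
A = [E if n % 2 == 0 else F for n in range(1, N + 1)]

# Truncated tail unions  ⋃_{m >= n} A_m ...
tails = [set().union(*A[n:]) for n in range(N - 1)]
# ... intersected over n, as in the definition of limsup.
limsup = reduce(set.intersection, tails)
print(limsup)  # {0, 1, 2}, i.e. E ∪ F
```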

**(Borel-Cantelli lemma)** Let \((A_n)\) be events with \(\sum_{n=1}^{\infty} {\mathbb P}(A_n) < \infty\). Then \({\mathbb P}( \limsup_{n \to \infty} A_n ) = 0\); that is, almost surely only finitely many of the \(A_n\) occur.

Notice that the Borel-Cantelli lemma does not require any independence. Later, we will discuss a partial converse that will require independence.

The Borel-Cantelli lemma is useful for proving almost-sure convergence because of the following fact: \(X_n \to X\) almost surely if and only if for every \(\varepsilon > 0\), we have \({\mathbb P}( |X_n - X| > \varepsilon \text{ i.o.} ) = 0\).

We prove the following version of the strong law of large numbers with the *extra* assumption that we have finite fourth moments.

**(Strong law with fourth moments)** Let \(X_1, X_2, \ldots\) be i.i.d. random variables with \({\mathbb E}X_1 = 0\) and \({\mathbb E}X_1^4 < \infty\). Set \(S_n := X_1 + \cdots + X_n\). Then \(S_n / n \to 0\) almost surely.

The assumption that \({\mathbb E}X_1=0\) can easily be removed by considering \(Y_i = X_i - {\mathbb E}X_i\). We will use the Borel-Cantelli lemma and a slightly more general version of Markov’s inequality.

*Proof* (The Strong law with fourth moments). Observe that

\[{\mathbb E}S_n^4 = \sum_{i,j,k,\ell} {\mathbb E}( X_i X_j X_k X_{\ell}).\] By independence and the assumption \({\mathbb E}X_1 = 0\), the only non-zero terms are of the form \({\mathbb E}(X_i^4)\) and \({\mathbb E}(X_i^2 X_j^2) = {\mathbb E}X_i^2 {\mathbb E}X_j^2\) with \(i \neq j\); every other term contains an isolated factor \(X_i\) whose expectation vanishes. Thus some counting (each unordered pair \(\{i, j\}\) contributes \(\binom{4}{2} = 6\) ordered index patterns, giving \(3n(n-1)\) mixed terms) yields
\[ {\mathbb E}S_n^4 = n{\mathbb E}X_1^4 +3n(n-1) {\mathbb E}X_1^2 {\mathbb E}X_2 ^2.\]
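The counting can be sanity-checked numerically (a sketch, not from the notes; the Rademacher choice and sample sizes are mine): for i.i.d. Rademacher signs, \({\mathbb E}X_1^2 = {\mathbb E}X_1^4 = 1\), so the formula predicts \({\mathbb E}S_n^4 = n + 3n(n-1) = 3n^2 - 2n\). Assumes `numpy`.

```python
import numpy as np

# Monte Carlo check of E S_n^4 = n*E X_1^4 + 3n(n-1)*(E X_1^2)^2 for
# i.i.d. Rademacher steps (X_i = +/-1), where the formula reduces to
# E S_n^4 = 3 n^2 - 2 n.
rng = np.random.default_rng(1)
n, trials = 20, 200_000
steps = rng.choice([-1.0, 1.0], size=(trials, n))
S = steps.sum(axis=1)
estimate = float(np.mean(S ** 4))
predicted = 3 * n ** 2 - 2 * n  # = 1160 for n = 20
print(estimate, predicted)
```

The Monte Carlo estimate lands within a few percent of the predicted value.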

Markov’s inequality gives that

\[ \varepsilon^4 n^4 {\mathbb P}(|S_n|/n >\varepsilon) \leq {\mathbb E}S_n^4.\] Combined with the moment computation above, this yields \({\mathbb P}(|S_n|/n > \varepsilon) \leq C/(\varepsilon^4 n^2)\) for a constant \(C > 0\) depending only on the moments of \(X_1\); this bound is summable in \(n\), so the Borel-Cantelli lemma together with the fact relating almost-sure convergence to events occurring infinitely often yields the desired result.

The Borel-Cantelli lemma has the following converse in the case where we *do* assume independence of the events: if the events \(A_n\) are independent and \(\sum_{n=1}^{\infty} {\mathbb P}(A_n) = \infty\), then \({\mathbb P}( \limsup_{n \to \infty} A_n ) = 1\).
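The strong-law conclusion can also be watched along a sample path (a sketch, not from the notes; the Uniform\([-1,1]\) steps and horizons are my own choices): the running averages \(S_n/n\) of mean-zero steps settle down to \(0\). Assumes `numpy`.

```python
import numpy as np

# Sketch: one sample path of S_n / n for i.i.d. mean-zero steps (here
# Uniform[-1, 1], which have all moments); by the fourth-moment strong
# law, S_n / n -> 0 almost surely.
rng = np.random.default_rng(2)
N = 100_000
steps = rng.uniform(-1.0, 1.0, size=N)
running_mean = np.cumsum(steps) / np.arange(1, N + 1)
for n in [10, 1000, 100_000]:
    print(n, running_mean[n - 1])
```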

*Proof.* By De Morgan’s laws, it suffices to show that
\[\begin{equation}
{\mathbb P}\big( \bigcup_{m=1} ^{\infty} \bigcap _{n=m} ^{\infty} A_n^c \big) =0.
\end{equation}\]
We will show that for each \(m \geq 1\), we have
\[ \lim_{N \to \infty} {\mathbb P}\big( \bigcap_{n=m} ^{N} A_n^c \big) = 0;\]
the desired result then follows from the continuity of the probability measure and the fact that a countable union of null events is null. Indeed, by independence and the inequality \(1 - x \leq e^{-x}\),
\[ {\mathbb P}\big( \bigcap_{n=m} ^{N} A_n^c \big) = \prod_{n=m}^{N} \big( 1 - {\mathbb P}(A_n) \big) \leq \exp\Big( - \sum_{n=m}^{N} {\mathbb P}(A_n) \Big) \to 0 \]
as \(N \to \infty\), since \(\sum_{n=m}^{\infty} {\mathbb P}(A_n) = \infty\).
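To make the vanishing intersection probability concrete, here is a sketch with the specific choice \({\mathbb P}(A_n) = 1/n\) for independent \(A_n\) (this choice is mine, not from the notes; note \(\sum 1/n = \infty\)): by independence the avoidance probability is a product that telescopes to \((m-1)/N\).

```python
import math

# Sketch: with independent events P(A_n) = 1/n, the probability of avoiding
# all of A_m, ..., A_N is prod_{n=m}^{N} (1 - 1/n), which telescopes to
# (m - 1)/N and tends to 0 as N grows.
def avoid_prob(m: int, N: int) -> float:
    return math.prod(1.0 - 1.0 / n for n in range(m, N + 1))

m = 5
for N in [10, 100, 10_000]:
    print(N, avoid_prob(m, N), (m - 1) / N)
```

The numerics match the telescoped closed form, illustrating the limit in the proof.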

We already saw how powerful the first Borel-Cantelli lemma is: it let us prove an almost-sure strong law! Typical textbook practice applications of the Borel-Cantelli lemma include exercises like the following.

**Exercise 3.1**
Let \((X_i)_{i=1} ^{\infty}\) be an i.i.d. sequence of standard normal variables. Prove that almost surely \[\limsup_{n \to \infty} \frac{X_n}{\sqrt{2 \log n}} =1.\] Hint: recall that for \(t >0\), we have
\[ \frac{1}{\sqrt{2\pi}} \big(\frac{1}{t} - \frac{1}{t^3}\big) e^{-\tfrac{t^2}{2}} \leq {\mathbb P}(X_1 >t)\leq \frac{1}{t\sqrt{2\pi}} e^{-\tfrac{t^2}{2}}.\]
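The hint’s sandwich can be checked numerically (a sketch, not from the notes), using the identity \({\mathbb P}(X > t) = \tfrac12 \operatorname{erfc}(t/\sqrt{2})\) for a standard normal \(X\).

```python
import math

# Sketch: numerically verify the Gaussian tail sandwich from the hint,
# using P(X > t) = erfc(t / sqrt(2)) / 2 for a standard normal X.
def tail(t: float) -> float:
    return 0.5 * math.erfc(t / math.sqrt(2.0))

for t in [1.5, 2.0, 3.0, 4.0]:
    lower = (1 / t - 1 / t ** 3) * math.exp(-t * t / 2) / math.sqrt(2 * math.pi)
    upper = math.exp(-t * t / 2) / (t * math.sqrt(2 * math.pi))
    print(t, lower <= tail(t) <= upper)
```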

*Solution.* We will apply the Borel-Cantelli lemma in its first form to show that
\[\limsup_{n \to \infty} \frac{X_n}{\sqrt{2 \log n}} \leq 1\]
and use its second form (where we will use the assumed independence) to show that
\[ \limsup_{n \to \infty} \frac{X_n}{\sqrt{2 \log n}} \geq 1.\] We will also employ the upper and lower bounds on the tail of a standard normal given by the hint.

Let \(\varepsilon>0\). For \(n \geq 2\) we have \[\begin{eqnarray*} {\mathbb P}\Big ( \frac{X_n}{\sqrt{2 \log n}} > 1 + \varepsilon\Big ) &=& {\mathbb P}\Big ( X_n > (1 + \varepsilon) \sqrt{2\log n} \Big ) \\ &\leq & \frac{1}{ (1+\varepsilon) \sqrt{2 \log n} \sqrt{2\pi}} e^{-(1+\varepsilon)^2 \log n} \\ &\leq & \frac{1}{n^{(1+ \varepsilon)^2 } } \\ &=:& a_n. \end{eqnarray*}\] Since \((1+\varepsilon)^2 > 1\), the sequence \(a_n\) is summable, so by the first form of the Borel-Cantelli lemma, almost surely \(X_n / \sqrt{2 \log n} \leq 1 + \varepsilon\) for all but finitely many \(n\); taking a countable sequence \(\varepsilon \downarrow 0\) gives the upper bound.

For the lower bound, note that \(\frac{1}{t} - \frac{1}{t^3} > \frac{1}{2t}\) for \(t \geq 2\). Observe that (even without an \(\varepsilon\)), \[{\mathbb P}\Big ( \frac{X_n}{\sqrt{2 \log n}} > 1\Big) \geq \frac{1}{2\sqrt{2\pi} \sqrt{2 \log n}}e^{- \log n} > \frac{c}{n \log n}=:b_n\] for some \(c >0\) and all sufficiently large \(n\). We know from calculus that the partial sums of \(b_n\) diverge to \(\infty\); since the events \(\left\{ X_n > \sqrt{2 \log n} \right\}\) are independent, the lower bound follows from the second form of the Borel-Cantelli lemma.

- We stated and proved the Borel-Cantelli lemma.
- We used the Borel-Cantelli lemma to prove the strong law of large numbers with a fourth moment condition.
- We also proved a partial converse to the Borel-Cantelli lemma.

- Version: 30 November 2020