Introduction

A Brownian motion \((B_t)_{t \in [0, \infty)}\) started at \(B_0=0\) is a collection of real-valued random variables defined on a probability space with the following properties: the increments over disjoint intervals are independent; for \(0 \le s < t\), the increment \(B_t - B_s\) is distributed as \(N(0, t-s)\); and the sample paths \(t \mapsto B_t\) are continuous. We will sometimes write \(B(t) = B_t\).

Exercise: Scaling: Let \(B\) be a Brownian motion. Fix \(u >0\). The process \(Y\) given by \(Y_t = \frac{1}{\sqrt{u}}B_{ut}\) is a Brownian motion.
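The scaling property can be sanity-checked numerically (this is an illustration, not a proof, and all parameter names below are illustrative choices): simulate many discretized Brownian paths on \([0,u]\), rescale, and verify that \(\operatorname{Var}(Y_t) \approx t\).

```python
import numpy as np

rng = np.random.default_rng(0)
n_paths, n_steps, u = 100_000, 64, 4.0
dt = u / n_steps  # simulate B on [0, u] so that Y covers [0, 1]

# Brownian increments: independent N(0, dt)
increments = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
B = np.cumsum(increments, axis=1)  # B_{u k/n_steps} for k = 1..n_steps
Y = B / np.sqrt(u)                 # Y_t = B_{ut} / sqrt(u)

# If Y is a Brownian motion, Var(Y_t) should equal t.
t = np.arange(1, n_steps + 1) / n_steps
emp_var = Y.var(axis=0)
print(np.max(np.abs(emp_var - t)))  # small for large n_paths
```

One can check the covariance structure \(\operatorname{Cov}(Y_s, Y_t) = s \wedge t\) the same way; independence and normality of increments follow directly from how the paths were built.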

The main difficulty with proving the existence of Brownian motion is the continuity requirement. Brownian motion is named after Brown (1827), who observed the irregular motion of pollen grains. The first mathematical construction is due to Wiener (1923); Kendall notes that this was before Kolmogorov’s axioms for probability.

Lévy’s construction

We will inductively piece together the desired Brownian motion. We let \(D_0 = \mathbb{Z}^{+}\), \(D_1 = \frac{1}{2}D_0 = \{\tfrac{1}{2}, 1, \tfrac{3}{2},\ldots\}\), and in general \(D_n = \{\tfrac{1}{2^n}, \tfrac{2}{2^n}, \ldots, 1, 1+ \tfrac{1}{2^n},\ldots\}\). Set \(D = \bigcup_{n \in \mathbb{N}} D_n\) so that \(D\) is the set of all positive dyadic rationals. Note that \(D\) is a countable set, and thus from a single random variable that is uniformly distributed on the unit interval we can produce iid standard normal random variables \(Y= (Y_t)_{t \in D}\). We will construct Brownian motion (without continuity) on each \(D_n\) and take a uniform limit to obtain the desired Brownian motion.
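The claim that one uniform random variable suffices can be made concrete (a sketch of the standard argument, not the construction in the text): the binary digits of a Uniform\((0,1)\) variable are iid fair bits, the bits can be dealt round-robin into countably many independent uniforms, and Box–Muller converts uniform pairs into iid standard normals. For the demo we generate the bit stream directly, since a floating-point number only carries about 53 random bits; all names here are illustrative.

```python
import math
import random

random.seed(0)
# The digit stream of the single uniform U (iid fair bits).
bits = [random.getrandbits(1) for _ in range(4 * 64)]

def deal(bits, n_streams):
    """Deal the bit stream round-robin into n_streams new bit streams."""
    return [bits[i::n_streams] for i in range(n_streams)]

def to_uniform(stream):
    """Reassemble a bit stream into a number in (0, 1)."""
    u = sum(b * 2.0 ** -(k + 1) for k, b in enumerate(stream))
    return min(max(u, 2.0 ** -60), 1 - 2.0 ** -60)  # keep away from 0 and 1

def box_muller(u1, u2):
    """Two independent standard normals from two independent uniforms."""
    r = math.sqrt(-2.0 * math.log(u1))
    return r * math.cos(2 * math.pi * u2), r * math.sin(2 * math.pi * u2)

us = [to_uniform(s) for s in deal(bits, 4)]
z = box_muller(us[0], us[1]) + box_muller(us[2], us[3])  # four iid N(0,1)
print(z)
```

Extending the dealing scheme to countably many streams (e.g. routing bit \(k\) by a pairing function) yields the full family \((Y_t)_{t \in D}\).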

We first define \(B\) on \(D_0\) by \[B_t = Y_1 + \cdots + Y_t, \qquad t \in D_0.\] Given \(B\) on \(D_{n-1}\), for each new point \(t \in D_n \setminus D_{n-1}\) we set \[B_t = \frac{B_{t - 2^{-n}} + B_{t + 2^{-n}}}{2} + \frac{Y_t}{2^{(n+1)/2}};\] that is, we linearly interpolate the neighbouring values and add an independent, suitably scaled normal correction. Write \(E^n\) for the piecewise-linear correction process, which vanishes on \(D_{n-1}\) and equals \(Y_t/2^{(n+1)/2}\) at the new points, so that the level-\(n\) path is the level-\((n-1)\) path plus \(E^n\).

In order to take a continuous limit, we proceed as follows.

Let
\[M_n = \sup_{t \in [0,1]} |E^n_t| = \sup_{ t \in [0,1] \cap (D_n \setminus D_{n-1}) }\frac{|Y_t|}{2^{(n+1)/2}}. \] Thus it suffices to argue that \[\mathbb{E}\sum_{n=1} ^ {\infty} M_n = \sum_{n=1} ^ {\infty} \mathbb{E} M_n < \infty.\] Recall that the \(Y_t\) are iid standard normals, so \(M_n\) is the maximum of the absolute values of \(2^{n-1}\) of them, divided by \(2^{(n+1)/2}\). Thus the bound is not hard: for example, the expectation of the maximum of \(k\) iid standard normals is bounded by \(\sqrt{2 \log k}\). The key point is that the numerator grows only logarithmically, while the denominator grows geometrically. We will explore these possibilities later.

To check the increment properties, we use the fact that \(B\) on \([0, \infty)\) is the uniform (on compacts) limit of processes with the desired increment distributions.

Baby Donsker theorem

Here, we will give a proof of the central limit theorem that can be extended to prove Donsker’s version of the CLT. The proof relies on the following coupling.

Skorokhod embedding theorem: Let \(B\) be a Brownian motion. Let \(X\) be a random variable with mean zero and unit variance. There exists a stopping time \(T\) such that \(B_T \stackrel{d}{=} X\) and \(\mathbb{E}T = 1 = \mathbb{E} X^2\).

It turns out that it is enough for our purposes to have a randomized stopping time \(T\); that is, the event \(\{T \leq t\}\) is a function of the Brownian motion up to time \(t\) and of a random variable \(U\) that is independent of \(B\). You may recall the Gambler’s ruin problem. In the case of Brownian motion, something similar is true, but we do not yet have the (basic) machinery to carry out a proof.

Lemma (Gambler’s ruin for BM): Let \(B\) be Brownian motion with \(B_0 =0\). Let \(a < 0 <b\). Let \[T=T_{a,b} = \inf\{t\geq 0: B_t \not \in (a,b)\}.\] Then \[\mathbb{E}(T) = -ab = \mathbb{E} B_T^2\] and the probabilities of exiting at \(a\) and at \(b\) are given, respectively, by \[\frac{b}{b-a} \text{ and } \frac{-a}{b-a}.\]
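The lemma is easy to sanity-check by Monte Carlo (an illustrative Euler discretization, not a proof; all parameters are illustrative): approximate \(B\) by small Gaussian steps and record when and where each path first leaves \((a,b)\).

```python
import numpy as np

rng = np.random.default_rng(2)
a, b = -1.0, 2.0
dt, n_paths, max_steps = 1e-3, 5_000, 40_000

x = np.zeros(n_paths)                       # current position of each path
exit_time = np.zeros(n_paths)
exited_at_b = np.zeros(n_paths, dtype=bool)
alive = np.ones(n_paths, dtype=bool)        # paths still inside (a, b)
for step in range(1, max_steps + 1):
    x[alive] += rng.normal(0.0, np.sqrt(dt), alive.sum())
    out = alive & ((x <= a) | (x >= b))     # paths exiting on this step
    exit_time[out] = step * dt
    exited_at_b[out] = x[out] >= b
    alive &= ~out
    if not alive.any():
        break

print(exit_time.mean(), -a * b)        # both approximately 2
print(exited_at_b.mean(), -a / (b - a))  # both approximately 1/3
```

With \(a=-1\), \(b=2\): \(\mathbb{E}T = -ab = 2\), and the path exits at \(b\) with probability \(-a/(b-a) = 1/3\), matching the simulation up to discretization and Monte Carlo error.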

Thus if \(X\) has two values \(a < 0 < b\), we can take \(T\) to be the desired stopping time. These ideas can be extended to cover all random variables with mean zero and finite variance.

Exercise: Can we just take \[T = \inf\{t\geq 0: B_t = X\}?\] Exercise: What can we do if \(X\) takes three values?

Brownian motion has enough (strong Markov) independence properties that it can be decomposed just as we did for Markov chains.

Skorokhod embedding theorem (Corollary):

Let \(B\) be a Brownian motion. Let \(X\) be a random variable with mean zero and unit variance. There exists a stopping time \(T\) such that \(B_T \stackrel{d}{=} X\) and \(\mathbb{E}T = 1 = \mathbb{E} X^2\). Furthermore, if \(X_1, \ldots, X_n\) are iid, all with the same distribution as \(X\), and \(S_n = X_1 + \cdots + X_n\), then there exists a sequence of stopping times \(T_0=0 \leq T_1 \leq \cdots \leq T_n\) with iid increments \(T_k - T_{k-1} \stackrel{d}{=} T\) such that \(S_n \stackrel{d}{=} B(T_n)\).

Proof of CLT: Set \[W_n(t) = B(nt)/\sqrt{n} \stackrel{d}{=} B(t),\] where the equality in distribution is the scaling exercise.
Then by the Skorokhod corollary, we have \[S_n/\sqrt{n} \stackrel{d}{=} B(T_n)/\sqrt{n} = W_n(T_n/n).\] By the law of large numbers, we have that \(T_n/n \to 1\) almost surely; we only need convergence in probability. By standard (weak convergence) arguments, we obtain that \[S_n/\sqrt{n} \xrightarrow{ \text{dist} } W(1) \sim N(0,1).\]
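An illustrative check of the conclusion in the simplest two-valued case, \(X = \pm 1\) with probability \(1/2\) each: by the gambler’s-ruin lemma, the embedded values \(B(T_k)\) form exactly this simple random walk, so we can simulate the walk directly and compare \(S_n/\sqrt{n}\) with \(N(0,1)\). All parameters are illustrative.

```python
import math
import numpy as np

rng = np.random.default_rng(3)
n, n_paths = 1_600, 20_000
# Simple random walk steps, +/-1 with probability 1/2 each.
steps = rng.integers(0, 2, size=(n_paths, n), dtype=np.int8) * 2 - 1
Z = steps.sum(axis=1, dtype=np.int64) / np.sqrt(n)  # S_n / sqrt(n)

# Standard normal CDF via the error function.
Phi = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
print(Z.mean(), Z.var())            # approximately 0 and 1
print((Z <= 1.0).mean(), Phi(1.0))  # close to each other
```

The same experiment with any mean-zero, unit-variance step distribution gives the same limiting histogram, which is exactly what the embedding argument explains.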

Endnotes