Choose four out of the five questions.

Probability spaces and the generation of random variables

Define an explicit function \(\phi: [0,1] \to [0,1] \times \{1,2\}\) such that if \(U\) is uniformly distributed on \([0,1]\), then the two coordinates of \(\phi(U)\) are independent, with the first uniform on \([0,1]\) and the second uniform on \(\{1,2\}\).

Solution

Let \(b:[0,1] \to \{0,1\}^{\mathbb{Z}^{+}}\) be such that \(b(u)\) is the binary expansion of \(u \in [0,1]\); this map is bijective modulo a countable set. If \(U\) is uniform on \([0,1]\), then \(b(U)\) is a sequence of i.i.d. Bernoulli\((\tfrac{1}{2})\) random variables, so the first bit is independent of the tail \((b(U)_n)_{n=2}^{\infty}\), and the tail maps back under \(b^{-1}\) to a uniform random variable on \([0,1]\). We may therefore define \[\phi(u) = \Big(b^{-1}\big[(b(u)_n)_{n=2}^{\infty}\big],\; b(u)_1 + 1\Big),\] where the shift by \(1\) puts the second coordinate in \(\{1,2\}\).
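
As a sanity check (not part of the original solution), the construction can be simulated: in binary, \(\phi\) acts as \(u \mapsto (\operatorname{frac}(2u), \lfloor 2u \rfloor + 1)\), since the first bit of \(u\) is \(\lfloor 2u \rfloor\) and the remaining bits encode \(\operatorname{frac}(2u)\). A minimal R sketch (variable names are illustrative):

set.seed(1)
u <- runif(1e5)
w <- floor(2 * u) + 1   # first binary digit of u, shifted to {1, 2}
v <- (2 * u) %% 1       # remaining binary digits, again a number in [0, 1]
mean(w == 1)            # approximately 1/2
ks.test(v, "punif")     # v is consistent with the uniform distribution on [0, 1]
cor(v, w)               # approximately 0; the coordinates are in fact independent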

Borel-Cantelli

Let \((X_i)_{i=2} ^{\infty}\) be i.i.d. continuous random variables with probability density function given by \[ f(x) = \frac{1}{2} |x|^3 e^{-x^4/4}.\] Find a deterministic sequence \((c_i)_{i=2}^{\infty}\) such that almost surely \[\limsup_{n\to \infty} \frac{X_n}{c_n} = 1.\]

Solution

For \(x\geq 0\), integrating the density gives \[\mathbb{P}(X_n > x) = \frac{1}{2}e^{-x^4/4}.\] Take \[c_n = (4 \log n)^{\tfrac{1}{4}}.\] We will show using the first Borel-Cantelli lemma that almost surely \[\limsup_{n\to \infty} \frac{X_n}{c_n} \leq 1,\] and then using the second Borel-Cantelli lemma and the independence of the \(X_i\) that almost surely \[ \limsup_{n \to \infty} \frac{X_n}{c_n} \geq 1.\] Let \(\epsilon \geq 0\). We have \[ \mathbb{P}( X_n/c_n > 1+ \epsilon) = \frac{1}{2 n^{ (1+\epsilon)^4}}=:a_n.\] For \(\epsilon>0\) the sequence \(a_n\) is summable, so by the first Borel-Cantelli lemma, almost surely \(X_n/c_n \leq 1+\epsilon\) for all large \(n\); letting \(\epsilon \downarrow 0\) along a countable sequence gives the upper bound. For the lower bound, set \(\epsilon=0\): then \(a_n = \tfrac{1}{2n}\) is not summable, so by independence and the second Borel-Cantelli lemma, \(X_n/c_n > 1\) infinitely often almost surely.
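
The claim can also be seen numerically. One can sample from the density by inversion: for \(x \geq 0\), \(\mathbb{P}(|X| > x) = e^{-x^4/4}\), so \(|X| = (-4\log U)^{1/4}\) for \(U\) uniform, with an independent symmetric sign. A minimal R sketch (not part of the proof; the index starts at \(2\) so that \(c_n > 0\)):

set.seed(1)
n <- 1e5
idx <- 2:n
u <- runif(length(idx))
x <- sample(c(-1, 1), length(idx), replace = TRUE) * (-4 * log(u))^(1/4)
cn <- (4 * log(idx))^(1/4)
max(x / cn)                 # over all n: may slightly exceed 1
max((x / cn)[idx > 1000])   # tail maximum: empirically close to 1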

Markov chains

Prove that for an irreducible aperiodic Markov chain on a finite state space \(S\), we have that for each \(s \in S\), the return time

\[T := \inf\{n \geq 1: X_n=s\}\] has finite expectation, regardless of the starting distribution of the chain.

Solution

We go back to basics; in particular, to observations made in connection with the Doeblin coupling. If \(P\) is the transition matrix, then by irreducibility and aperiodicity, \(P^M\) has all positive entries for some \(M>0\); let \(\delta >0\) be its smallest entry. Hence, regardless of the current state, in every block of \(M\) steps the chain reaches \(s\) with probability at least \(\delta\). By the Markov property, \[ \mathbb{P}(T > kM) \leq (1- \delta)^k,\] and therefore \[\mathbb{E}[T] = \sum_{n=0}^{\infty} \mathbb{P}(T > n) \leq M \sum_{k=0}^{\infty} (1-\delta)^k = \frac{M}{\delta} < \infty.\]
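
The conclusion is easy to check by simulation. A minimal R sketch with a hypothetical \(3\)-state chain (the transition matrix below is illustrative, not from the problem):

# Irreducible, aperiodic transition matrix on {1, 2, 3}
P <- matrix(c(0.5, 0.3, 0.2,
              0.2, 0.6, 0.2,
              0.3, 0.3, 0.4), nrow = 3, byrow = TRUE)
s <- 1  # target state
return_time <- function(start) {
  x <- start
  for (n in 1:10000) {  # cap at 10000 steps; the tail bound makes hitting the cap negligible
    x <- sample(1:3, 1, prob = P[x, ])
    if (x == s) return(n)
  }
  NA
}
times <- replicate(10000, return_time(start = sample(1:3, 1)))
mean(times, na.rm = TRUE)  # finite empirical mean, consistent with E[T] < Inf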

Poisson processes

Suppose that \(\Pi\) is a Poisson process on \(\mathbb{R}^d\). Let \(t \in \mathbb{R}^d\) and \(c \in (0, \infty)\). Prove that the translated process \(\Gamma := t + \Pi\) and the scaled process \(\Sigma := c\,\Pi\) are again Poisson point processes, and identify their intensities.

Solution

Suppose that \(\Pi\) is a Poisson point process of intensity \(\lambda\). Then \(\Pi\) satisfies:

  • \(\Pi(A)\) is a Poisson random variable with mean \(\lambda |A|\) for every set \(A\) of finite volume.

  • The random variables \(\Pi(A_1), \ldots, \Pi(A_n)\) are independent for pairwise disjoint sets \(A_1, \dots, A_n\).

It suffices to verify these properties for \(\Gamma\) and \(\Sigma\). For a subset \(A \subset \mathbb{R}^d\), let \(t + A = \{t+a: a \in A\}\) and \(cA = \{ca: a \in A\}\).

  • Observe that \(\Gamma(A) = \Pi(-t +A)\) and \(|{-t}+A| = |A|\), so both required properties transfer directly: \(\Gamma\) is a Poisson point process with the same intensity \(\lambda\).

  • Similarly, \(\Sigma(A) = \Pi(c^{-1}A)\) and \(|c^{-1}A| = c^{-d}|A|\), so \(\Sigma\) is a Poisson point process of intensity \(c^{-d}\lambda\); see the simulation sketch below.
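
As an illustration (not part of the original solution), the scaling case can be checked by simulation in \(d = 2\): taking \(A = [0,1]^2\), we have \(\Sigma(A) = \Pi([0, 1/c]^2)\), whose mean should be \(c^{-2}\lambda\). A minimal R sketch with hypothetical values \(\lambda = 5\), \(c = 2\):

set.seed(1)
lambda <- 5; c0 <- 2
counts <- replicate(1e4, {
  N <- rpois(1, lambda)           # Pi has Poisson(lambda) many points in [0, 1]^2
  x <- runif(N); y <- runif(N)    # placed independently and uniformly
  sum(x <= 1 / c0 & y <= 1 / c0)  # Sigma([0, 1]^2) = Pi([0, 1/c]^2)
})
mean(counts)  # approximately lambda / c0^2 = 1.25, i.e. intensity c^{-d} * lambda
var(counts)   # approximately equal to the mean, as expected for a Poisson count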

Estimating the stationary distribution

The file markovchain.txt contains \(100000\) consecutive observations of an ergodic Markov chain on the state space \(\{1,2,3,4\}\). Estimate its stationary distribution \(\pi\).

Solution

By the ergodic theorem for Markov chains, almost surely, for each state \(s\), \[\frac{1}{n} \sum_{i=1}^n \mathbf{1}[X_i = s] \to \pi(s).\] Thus for each state, we simply count the number of occurrences among the \(100000\) observations and divide by \(100000\) to get its approximate probability under \(\pi\).

# Read the chain observations and extract them as a vector
z <- read.table("markovchain.txt", sep = ",")
z <- z[, 1]

# Histogram of the empirical state frequencies
b1 <- seq(-1, 5, by = 1)
hist(z, prob = TRUE, breaks = b1)

sum(z==1)/100000
## [1] 0.17023
sum(z==2)/100000
## [1] 0.24957
sum(z==3)/100000
## [1] 0.41172
sum(z==4)/100000
## [1] 0.16849
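
Equivalently, all four estimates can be computed in one call:

prop.table(table(z))  # empirical frequency of each state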