Abstract

This document contains a running list of recommended homework exercises.

Week 1 (2022)

Week 2

Week 3

Week 4

Week 5

From (2021)

Basics

Coupling

\[d_{TV}(S, W) \leq \sum_{i=1}^n d_{TV}(X_i, Y_i).\]
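
As a sanity check, the bound can be tested numerically under one standard reading of the statement, namely \(S = \sum_i X_i\) and \(W = \sum_i Y_i\) with independent Bernoulli components; this interpretation and the probabilities below are assumptions for illustration only. A minimal sketch in Python:

```python
import numpy as np

# Assumed setup (not from the original text): S = X_1 + ... + X_n and
# W = Y_1 + ... + Y_n with independent X_i ~ Bernoulli(p_i), Y_i ~ Bernoulli(q_i).
# Then d_TV(X_i, Y_i) = |p_i - q_i|, and we check d_TV(S, W) <= sum_i |p_i - q_i|.

def bernoulli_sum_pmf(probs):
    """Exact pmf of a sum of independent Bernoulli(p_i) variables on {0, ..., n}."""
    pmf = np.zeros(len(probs) + 1)
    pmf[0] = 1.0
    for p in probs:
        pmf[1:] = pmf[1:] * (1 - p) + pmf[:-1] * p   # convolve with Bernoulli(p)
        pmf[0] *= 1 - p
    return pmf

rng = np.random.default_rng(0)
p, q = rng.uniform(size=10), rng.uniform(size=10)

lhs = 0.5 * np.abs(bernoulli_sum_pmf(p) - bernoulli_sum_pmf(q)).sum()  # d_TV(S, W)
rhs = np.abs(p - q).sum()                                              # sum of d_TV(X_i, Y_i)
print(f"d_TV(S, W) = {lhs:.4f} <= {rhs:.4f}")
```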

Markov chains

Show that if \(V_n\) is the number of visits to a state \(s\) in the first \(n\) steps of an ergodic Markov chain started at stationarity, then \(V_n/n \to \pi(s)\) in mean square, where \(\pi\) is the stationary distribution. Show that the assumption that the chain is started at stationarity can be removed.
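
A quick simulation of this exercise; the three-state transition matrix below is a hypothetical choice, not one from the notes:

```python
import numpy as np

# A minimal sketch, assuming a hypothetical 3-state chain (the transition
# matrix below is illustrative only).  V_n counts visits to state s in the
# first n steps; the chain is deliberately started away from stationarity.

P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.1, 0.5]])

# Stationary distribution: left eigenvector of P for eigenvalue 1, normalised.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi /= pi.sum()

rng = np.random.default_rng(1)
n, s, state = 200_000, 0, 2        # target state s = 0, start in state 2
visits = 0
for _ in range(n):
    visits += (state == s)
    state = rng.choice(3, p=P[state])

print(f"V_n/n = {visits / n:.4f},  pi(s) = {pi[s]:.4f}")
```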

Show that the strong mixing condition

\[\mu(A \cap T^{-n}B) \to \mu(A) \mu(B) \quad \text{for all } A, B \in \mathcal{F}\]

implies ergodicity. Note that, by the usual measure-theoretic arguments, verifying the above condition for a large enough class of \(A, B\) is equivalent to verifying the strong mixing condition.
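
One possible route, recorded here only as a hint: apply the condition with \(A = B\) an invariant set, \(T^{-1}A = A\), so that

\[\mu(A) = \mu(A \cap T^{-n}A) \to \mu(A)^2,\]

which forces \(\mu(A) \in \{0, 1\}\).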

By the ergodic theorem, for a stationary ergodic sequence \(X = (X_0, X_1, \ldots)\) with \(T\) the shift map,

\[ \frac{1}{n} \sum_{k=0}^{n-1} f( T^k X) \to \mathbb{E} f(X)\]

for all \(f\) such that \(\mathbb{E}[f(X)^2] < \infty\). Consider the function \(f\) given by

\[f(x) = \mathbf{1}[x_{2} = c,\, x_1 = b,\, x_0 = a].\]

What happens for a Markov chain? Run simulations for a particular Markov chain and check that everything is consistent.
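
A simulation sketch for the last question; the chain and the pattern \((a, b, c) = (0, 1, 2)\) below are hypothetical choices. For a stationary ergodic Markov chain the time average should approach \(\mathbb{E} f(X) = \pi(a) P(a, b) P(b, c)\).

```python
import numpy as np

# A minimal sketch, assuming a hypothetical 3-state chain and pattern
# (a, b, c) = (0, 1, 2); neither choice comes from the original notes.
# The time average of f(T^k X) = 1[X_k = a, X_{k+1} = b, X_{k+2} = c]
# should be close to E f(X) = pi(a) P(a, b) P(b, c).

P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.1, 0.5]])
a, b, c = 0, 1, 2

vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi /= pi.sum()

rng = np.random.default_rng(2)
n = 200_000
path = np.empty(n + 2, dtype=int)
path[0] = rng.choice(3, p=pi)                      # start at stationarity
for k in range(1, n + 2):
    path[k] = rng.choice(3, p=P[path[k - 1]])

time_avg = np.mean((path[:n] == a) & (path[1 : n + 1] == b) & (path[2 : n + 2] == c))
print(f"time average = {time_avg:.5f},  pi(a) P(a,b) P(b,c) = {pi[a] * P[a, b] * P[b, c]:.5f}")
```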

Endnotes