Abstract

This document contains a running list of recommended homework exercises.

(2024)

Poisson processes

Suppose a shop operates daily in the time interval \([a,b]\). It has customers arriving according to a Poisson process of intensity \(3\) in the time interval \([a, c)\), and a Poisson process of intensity \(5\) in the time interval \([c,b)\); here \(a\) and \(b\) are known, but \(c\) is unknown. You can imagine the shopkeeper notices that at some point in the day, the shop seems to get busier. The shopkeeper has a log of all the arrival times, for each of \(n\) days of operation, where \(n\) is large.

  • Given an open interval \((r,s) \subset [a,b]\), explain how you can use the shopkeeper’s log to make a good guess at whether or not \((r,s)\) contains the unknown time \(c\); show that as \(n \to \infty\) you will know with certainty whether \(c \in (r,s)\). Carefully explain your answer.

  • Demonstrate your answer by running simulations; for example, choose \(a=0\), \(b=8\), and \(c=4\), and simulate the arrivals to generate the shopkeeper’s log. Now apply your method with the intervals \((2.7, 4.3)\) and \((5,6)\).
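
Here is one possible simulation sketch in Python (not necessarily the intended method): simulate the shopkeeper’s log, and for a candidate interval \((r,s)\) compare the empirical arrival rate on \((r,s)\), averaged over the \(n\) days, with the two known intensities; the threshold `tol` below is an ad hoc choice.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_day(a=0.0, b=8.0, c=4.0, lam1=3.0, lam2=5.0):
    """Arrival times for one day: rate lam1 on [a, c), rate lam2 on [c, b)."""
    n1 = rng.poisson(lam1 * (c - a))
    n2 = rng.poisson(lam2 * (b - c))
    return np.sort(np.concatenate([a + (c - a) * rng.random(n1),
                                   c + (b - c) * rng.random(n2)]))

def guess_contains_c(log, r, s, lam1=3.0, lam2=5.0, tol=0.2):
    """Guess whether c lies in (r, s) from the shopkeeper's log.

    If c is outside (r, s), the average number of arrivals per day in (r, s),
    divided by s - r, converges to lam1 or lam2 by the law of large numbers;
    if c is strictly inside, it converges to a value strictly in between.
    """
    rate = np.mean([np.sum((t > r) & (t < s)) for t in log]) / (s - r)
    return lam1 + tol < rate < lam2 - tol

n = 10_000
log = [simulate_day() for _ in range(n)]
for (r, s) in [(2.7, 4.3), (5.0, 6.0)]:
    print((r, s), "contains c?", guess_contains_c(log, r, s))
```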

Branching processes

For historical purposes, see the original problem posed by Galton. Consider the problem in the case \(p_0 =1/16\), \(p_1 = 3/8\), \(p_2 = 3/8\), \(p_3 = 1/8\), \(p_4 = 1/16\), with \(Z_0=1\).
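
As one way to experiment with these numbers, here is a minimal Python sketch that simulates the Galton–Watson process with this offspring distribution and estimates the extinction probability; choosing to estimate extinction is my assumption about what to examine, not part of the original question.

```python
import numpy as np

rng = np.random.default_rng(1)
offspring_probs = [1/16, 3/8, 3/8, 1/8, 1/16]   # p_0, ..., p_4

def extinct_by(generations=50):
    """One Galton-Watson run with Z_0 = 1; True if the line dies out
    within the given number of generations."""
    z = 1
    for _ in range(generations):
        if z == 0:
            return True
        if z > 10_000:   # a large population is effectively certain to survive
            return False
        z = rng.choice(5, size=z, p=offspring_probs).sum()   # children of the z individuals
    return z == 0

trials = 10_000
print("estimated extinction probability:",
      sum(extinct_by() for _ in range(trials)) / trials)
```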

Markov chains (Estimating and coding)

The sample output of a Markov chain is available here.

  • Using this output in a reasonable way, generate and continue the Markov chain for another \(50\) steps.

  • Explain why your procedure is reasonable and display the output of your generated \(50\) steps.
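
One reasonable procedure (a sketch, and an assumption on my part, since the linked output is not reproduced here) is to estimate the one-step transition probabilities by empirical frequencies and then simulate forward from the last observed state; the `observed` sequence below is a hypothetical stand-in for the linked log.

```python
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(2)

def continue_chain(observed, steps=50):
    """Estimate one-step transition probabilities by the empirical frequencies
    of observed transitions, then simulate `steps` further states starting
    from the last observed state."""
    counts = defaultdict(lambda: defaultdict(int))
    for x, y in zip(observed, observed[1:]):
        counts[x][y] += 1
    path = [observed[-1]]
    for _ in range(steps):
        row = counts[path[-1]]   # assumes the current state was seen leaving at least once
        states, weights = zip(*row.items())
        probs = np.array(weights, dtype=float)
        path.append(states[rng.choice(len(states), p=probs / probs.sum())])
    return path[1:]

# hypothetical stand-in for the linked log
observed = ["a", "b", "b", "a", "c", "b", "a", "a", "b", "c", "c", "b", "a"]
print(continue_chain(observed, steps=50))
```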

Markov chains (Doeblin’s coupling)

Consider Doeblin’s coupling and let \(T\) be the random time when the two Markov chains \(X\) and \(Y\) meet. Suppose one is started at stationarity, and the common transition matrix is irreducible and aperiodic on a finite state space. Does \(X_T\) have the stationary distribution? Explain.
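
A small simulation can be used to probe this question; the two-state chain below is an arbitrary choice, and the empirical distribution of \(X_T\) is printed next to the stationary distribution so the two can be compared.

```python
import numpy as np

rng = np.random.default_rng(3)

# a small irreducible aperiodic chain (an arbitrary choice for illustration)
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
pi = np.array([2/3, 1/3])   # stationary distribution of P

def step(s):
    return rng.choice(2, p=P[s])

def meet_state():
    """Run the two chains independently, X from pi and Y from state 1, until
    they first agree (after time T the coupled chains would move together,
    but only X_T is needed here); return X_T."""
    x, y = rng.choice(2, p=pi), 1
    while x != y:
        x, y = step(x), step(y)
    return x

samples = np.array([meet_state() for _ in range(20_000)])
print("empirical law of X_T:", np.bincount(samples, minlength=2) / len(samples))
print("stationary law:      ", pi)
```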

Coupling

Let \(\epsilon >0\). Suppose that \(X\) is a Poisson random variable of mean \(\lambda >0\), and \(Y_{\epsilon}\) is a Poisson random variable of mean \(\lambda + \epsilon\). Use the power of coupling to upper bound the total variation distance between \(X\) and \(Y_{\epsilon}\) as a function of \(\epsilon\), and show that the distance converges to zero as \(\epsilon \to 0\).
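
For a numerical sense of the scaling, the sketch below computes the distance directly from the two mass functions for a few values of \(\epsilon\), using the convention \(d_{TV}(X,Y) = \sum_k |\mathbb{P}(X=k) - \mathbb{P}(Y=k)|\); the truncation at \(k \le 200\) and the choice \(\lambda = 2\) are arbitrary.

```python
import numpy as np

def poisson_pmf(lam, kmax=200):
    """pmf of Poisson(lam) on 0..kmax via the recursion p_{k+1} = p_k * lam/(k+1)."""
    p = np.empty(kmax + 1)
    p[0] = np.exp(-lam)
    for k in range(kmax):
        p[k + 1] = p[k] * lam / (k + 1)
    return p

lam = 2.0
for eps in [1.0, 0.5, 0.1, 0.01]:
    tv = np.abs(poisson_pmf(lam) - poisson_pmf(lam + eps)).sum()
    print(f"eps = {eps}: d_TV = {tv:.5f}")
```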

(2023)

Class 1

  • Consider your favorite triangle \(T\), with base lying flat on the \(x\)-axis. Compute the joint density \(f(x,y)\) for a point \((X,Y)\) that is uniformly distributed on \(T\). Are the marginal distributions uniform? Explain. (A simulation sketch for one concrete triangle appears after this list.)

  • Find a bijective transformation from a uniform random variable on the unit interval to one that is uniform on your triangle.

  • A simple example of a structural causal model (SCM) is given by a directed graph on two vertices, \(2 \to 1\), where random variables \(X_2\) and \(X_1\) have the following functional relation: \(X_1 = \phi_1(X_2, \epsilon_1)\), for some deterministic function \(\phi_1\), and noise \(\epsilon_1\) that is uniformly distributed on the unit interval and independent of \(X_2\); here we might think of \(2\) as causing \(1\). Show that if we are given a joint distribution on two random variables, it can always be represented as an SCM.
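
For the first item, here is a quick simulation for one concrete triangle (the vertices below are my choice) that histograms the \(x\)-marginal; a flat histogram would indicate a uniform marginal. The two-uniform sampler is a standard recipe and is not meant as an answer to the bijection question.

```python
import numpy as np

rng = np.random.default_rng(4)

# an arbitrary triangle with its base on the x-axis
A, B, C = np.array([0.0, 0.0]), np.array([2.0, 0.0]), np.array([0.5, 1.0])

# standard two-uniform sampler for a uniform point in the triangle ABC
u, v = rng.random(100_000), rng.random(100_000)
su = np.sqrt(u)
pts = (1 - su)[:, None] * A + (su * (1 - v))[:, None] * B + (su * v)[:, None] * C

# histogram of the x-coordinate: if the marginal were uniform, this would be flat
hist, edges = np.histogram(pts[:, 0], bins=20, density=True)
print(np.round(hist, 3))
```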

Class 2

  • The Perron–Frobenius theorem ensures that an aperiodic irreducible transition matrix \(P\) has a unique stationary distribution \(\pi\) and that all other eigenvalues have absolute value less than \(1\). Use this fact to show that \(p_{ij}(n) \to \pi(j)\). See also.

  • Are all transition matrices diagonalizable?

  • Consider simple symmetric random walk \(S\) on \(\mathbb{Z}\). Suppose \(S_0 = 0\). Prove that for every even \(k\), we have \(d_{TV}(S_n, S_n+k) \to 0\) as \(n \to \infty\).
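
For this last item, the distance can be computed exactly rather than estimated, since \(S_n\) has an explicit binomial-type distribution; the sketch below uses the convention \(d_{TV} = \sum_j |p_j - q_j|\) and takes \(k = 2\).

```python
from math import comb

def walk_pmf(n):
    """Exact pmf of S_n for the simple symmetric walk started at 0:
    P(S_n = 2k - n) = C(n, k) / 2^n."""
    return {2 * k - n: comb(n, k) / 2**n for k in range(n + 1)}

def tv_shift(n, shift=2):
    p = walk_pmf(n)
    q = {s + shift: m for s, m in p.items()}
    support = set(p) | set(q)
    return sum(abs(p.get(s, 0.0) - q.get(s, 0.0)) for s in support)

for n in [10, 100, 1000]:
    print(n, tv_shift(n))
```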

Class 3

  • We saw that if \(X = (X_0, X_1, \ldots)\) was a stationary sequence, with \(X_i \in \mathbb{R}\), then this gave rise to a measure-preserving system \((\mathbb{R}^{\mathbb{N}}, \mu, T)\), where \(\mu(\cdot) = \mathbb{P}(X \in \cdot)\), and \(T\) is the left-shift. Now suppose that \((\Omega, \mu, T)\) is an arbitrary measure-preserving system; what would you say is the corresponding stationary sequence?

Class 4

  • Suppose we are given the exponential inter-arrival time description of a Poisson process. Show that, conditional on there being exactly two points in the interval \([0,1]\), the distribution of those two points is given by two independent uniforms.

  • Suppose \(Z = (Z_i)_{i \in \mathbb{Z}}\) are i.i.d. real-valued random variables. Consider the point process \(\Pi\) with points given by \[\{ i + Z_i: i \in \mathbb{Z} \}.\] Show that \(\mathbb{E}(\Pi[0,1) ) = 1\).
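
A quick check of the expectation by simulation; the standard normal perturbations and the window size below are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(5)

def points_in_unit_interval(window=50):
    """One realization of {i + Z_i}: perturb the integers with |i| <= window and
    count how many land in [0, 1).  The window is wide enough that points
    from outside it essentially never reach [0, 1) for normal perturbations."""
    i = np.arange(-window, window + 1)
    pts = i + rng.standard_normal(i.size)
    return np.count_nonzero((pts >= 0.0) & (pts < 1.0))

trials = 100_000
print("average number of points in [0,1):",
      np.mean([points_in_unit_interval() for _ in range(trials)]))
```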

From (2022)

Week 1 (2022)

  • Using the law of large numbers, carefully justify why a probability histogram of a large sample from a pdf \(f\) ends up looking like \(f\).

  • Let \(T \geq 1\) be a positive integer-valued random variable. Let \(X_1, X_2, \ldots\) be iid random variables, with finite expectation. Let \(S_n = X_1 + \cdots +X_n\). Show that it may not be true that \[\mathbb{E}(S_T) = \mathbb{E}(T) \cdot \mathbb{E}(X_1).\]

  • Show by example that the pairwise independence of random variables is a weaker property than the (full) independence of random variables.

  • Using the law of large numbers and simulations, compute the following integral: \[\int_0 ^ 5 \sin(x) e^x dx;\] hint: identify a random variable that has this integral for its expectation. (See the sketch after this list.)

  • Let \(X\) be a continuous random variable, with cdf \(F\). Show that \(F(X)\) is uniformly distributed on \([0,1]\); for simplicity, you may assume that \(F\) is invertible.

  • Assuming that Python or R only has uniform random variables available, code (in two ways) a geometric random variable with support on the positive integers. Simulate your geometric random variable, and demonstrate by simulations that you have correctly coded it. (See the sketch after this list.)

  • Demonstrate, by simulations, that the sum of two independent Poisson random variables is again a Poisson random variable; in your simulations you may restrict to the case of means \(2.5\) and \(3.7\).
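
For the integral above, a minimal Monte Carlo sketch: write \(\int_0^5 \sin(x)e^x\,dx = \mathbb{E}[\,5\sin(U)e^{U}\,]\) with \(U\) uniform on \([0,5]\), and average over many samples.

```python
import numpy as np

rng = np.random.default_rng(6)

u = rng.uniform(0.0, 5.0, size=1_000_000)
estimate = np.mean(5.0 * np.sin(u) * np.exp(u))          # E[5 sin(U) e^U], U ~ Unif(0, 5)
exact = np.exp(5.0) * (np.sin(5.0) - np.cos(5.0)) / 2.0 + 0.5   # from the antiderivative e^x (sin x - cos x)/2
print(estimate, exact)
```

For the geometric exercise, two possible constructions from uniforms only (the success probability \(p = 0.3\) is an arbitrary choice); comparing the sample means with \(1/p\) is just one simple check, and one could also compare histograms with the geometric pmf.

```python
import numpy as np

rng = np.random.default_rng(7)
p = 0.3

def geometric_inverse(n):
    """Inverse transform: ceil(log U / log(1 - p)) is geometric on {1, 2, ...}."""
    u = rng.random(n)
    return np.ceil(np.log(u) / np.log(1.0 - p)).astype(int)

def geometric_trials(n):
    """Count uniform-based Bernoulli(p) trials until the first success."""
    out = np.empty(n, dtype=int)
    for j in range(n):
        k = 1
        while rng.random() >= p:
            k += 1
        out[j] = k
    return out

for sampler in (geometric_inverse, geometric_trials):
    x = sampler(200_000)
    print(sampler.__name__, "sample mean:", x.mean(), " (1/p =", 1 / p, ")")
```

And for the Poisson item, a quick comparison of the empirical pmf of a sum of independent Poisson variables of means \(2.5\) and \(3.7\) with that of a single Poisson of mean \(6.2\):

```python
import numpy as np

rng = np.random.default_rng(8)

n = 500_000
s = rng.poisson(2.5, n) + rng.poisson(3.7, n)   # sum of independent Poissons
ref = rng.poisson(2.5 + 3.7, n)                 # Poisson with the summed mean

kmax = int(max(s.max(), ref.max()))
emp_s = np.bincount(s, minlength=kmax + 1) / n
emp_ref = np.bincount(ref, minlength=kmax + 1) / n
print("max |difference| between empirical pmfs:", np.abs(emp_s - emp_ref).max())
```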

Week 2

  • How would you code a uniform point in a disc without throwing away randomness, as we do in rejection sampling?

  • How would you code a uniform point on the surface of a sphere? (One common recipe is sketched after this list.)

  • By simulation, reproduce Figure 12.

  • Show that it is indeed enough to consider the Weierstrass theorem on the unit interval.

  • What about a higher dimensional version of Weierstrass?
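
For the sphere question, one common recipe (given here only as a sketch, not as the unique answer) is to normalize a standard Gaussian vector, which is uniform on the sphere by rotational symmetry.

```python
import numpy as np

rng = np.random.default_rng(9)

# normalize standard Gaussian vectors to get points on the unit sphere
g = rng.standard_normal((100_000, 3))
pts = g / np.linalg.norm(g, axis=1, keepdims=True)

# sanity check: each coordinate of a uniform point on the 2-sphere is
# uniform on [-1, 1], so its mean is 0 and its variance is 1/3
print(pts[:, 2].mean(), pts[:, 2].var())
```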

Week 3

  • Do Question 2

  • If Markov chains have one-step memory, what happens if we study three-step memory processes? Is the theory the same? Why?

  • Suppose \(X\) and \(Y\) are jointly distributed random variables. Find a coupling \((X', Y')\) of \(X\) and \(Y\) such that \(Y' = \phi(X', U)\), where \(U\) is independent of \(X'\) and uniformly distributed on the unit interval.

  • Do Question 2

  • You may recall the central limit theorem for iid random variables; explore what happens with Markov chains. In particular, see Question 3

  • Suppose you are given only the data (the realization) of a Markov chain, say for one hundred steps; how would you go about generating another hundred steps of this Markov chain?

Week 4

  • So we have all this theory for Markov chains, where you only have to look back one step; what happens if you allow two-step memory?

  • Suppose I prove a law of large numbers type result for Markov chains started at stationarity. How would I extend this proof to any starting distribution?

  • Find an example where the sum of two dependent Poisson random variables is again a Poisson. Hint: perturb the joint mass function of two independent Poisson random variables.

  • Explore what happens if instead of starting with a Poisson number of iid uniforms, you use other discrete distributions.

  • Consider iid perturbations of the lattice; that is, for each \(n \in \mathbb{Z}^d\), we perturb it to obtain the perturbed lattice \[\Pi := \{n + X_n: n \in \mathbb{Z}^d\},\] where \((X_n)_{n \in \mathbb{Z}^d}\) are iid \(\mathbb{R}^d\)-valued random variables. Thus \(\Pi\) is a point process: a random scattering of points in \(\mathbb{R}^d\). Show that mass is conserved, so that the expected number of points in the set \([0,1)^d\) is (still) one. (Optional)

Week 5

  • From the modelling assumptions of Poisson processes, show that if the unit interval contains exactly one point, then that point is uniformly distributed.

    • Extend your proof to the case of two or more points.
  • How would you simulate a uniform on the surface of an ellipse? (One possible interpretation is sketched after this list.)

  • More Poisson questions here
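
For the ellipse question, one possible interpretation is "uniform with respect to arc length on the curve"; under that assumption, a sketch is to tabulate the arc-length function numerically and invert it by interpolation (the semi-axes below are arbitrary).

```python
import numpy as np

rng = np.random.default_rng(10)

# uniform w.r.t. arc length on the ellipse (x, y) = (a cos t, b sin t)
a, b = 3.0, 1.0
t = np.linspace(0.0, 2 * np.pi, 200_001)
speed = np.sqrt((a * np.sin(t))**2 + (b * np.cos(t))**2)   # |(d/dt)(a cos t, b sin t)|
arclen = np.concatenate([[0.0],
                         np.cumsum(0.5 * (speed[1:] + speed[:-1]) * np.diff(t))])

# draw uniform arc lengths and invert the arc-length function by interpolation
u = rng.uniform(0.0, arclen[-1], size=10_000)
ts = np.interp(u, arclen, t)
pts = np.column_stack([a * np.cos(ts), b * np.sin(ts)])
print(pts[:3])
```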

From (2021)

Basics

  • Using the law of large numbers, carefully justify why a probability histogram of a large sample from a pdf \(f\) ends up looking like \(f\).

  • Examine Exercise 2 by hand and by simulation.

  • Examine Exercise Pen and Paper by hand and by simulation.

  • Let \(X \geq 0\) be a continuous random variable with finite first moment. Prove that \[\mathbb{E} X = \int_0 ^{\infty} \mathbb{P}(X >t) dt.\] Hint: use a double integral.

  • Let \(X\) and \(Y\) be nonnegative independent continuous random variables. Prove that for \(t >0\), we have \[\mathbb{P}(XY > t) = \int_0 ^{\infty} \mathbb{P}(X >\tfrac{t}{y}) f_Y(y) dy,\] where \(f_Y\) is the probability density function for \(Y\).

  • Using the previous results prove that \[\mathbb{E}( X Y) = (\mathbb{E} X )(\mathbb{E} Y),\] assuming all the expectations are finite.

  • Recall that \(X\) has the gamma distribution with parameters \(\alpha, \beta >0\) if it has probability density function given by \[ f(x; \alpha, \beta)= \frac{ \mathbf{1}_{(0, \infty)}(x)}{\beta^{\alpha} \Gamma(\alpha)} x^{\alpha-1} e^{-x /\beta}. \] Let \(g: [0, \infty) \to \mathbb{R}\) be a bounded continuous function and \(n \in \mathbb{Z}^{+}\). Prove that
    \[ \lim_{n \to \infty} \frac{1}{\beta^n \Gamma(n)} \int_0^{\infty} g(x/n) x^{n-1} e^{-x /\beta} dx =g(\beta).\]

  • Let \(Z\) be a real-valued random variable. Recall that if \(m\) is the unique point such that \(\mathbb{P}(Z \leq m) = \tfrac{1}{2}\), then it is the median of \(Z\). We say that \(Z\) is symmetric about \(m\) if for all \(c \geq 0\), we have \(\mathbb{P}(Z -m \geq c) = \mathbb{P}(Z -m \leq -c)\). Let \(X = (X_1, \ldots, X_{2n+1})\) be a random sample from a symmetric distribution with unique median zero and order statistics given by \(Y_1 \leq Y_2 \leq \cdots \leq Y_{n+1} \leq \cdots \leq Y_{2n+1}\). The sample median is given by \(M(X) = Y_{n+1}\).

    • Show that \(-Y_{n+1}\) has the same distribution as \(Y_{n+1}\). Hint: you can do this without fancy order statistics knowledge.

    • Assuming the expectations exist, show that \(\mathbb{E} Y_{n+1} =0\).

    • Let \(\epsilon >0\). Let \(B \sim Bin(2n+1, p)\), where \(p = \mathbb{P}(X_1 > \epsilon)\). Show that \(\mathbb{P}( Y_{n+1} > \epsilon) = \mathbb{P}(B \geq n+1)\).

    • Show that \(\mathbb{P}(B \geq n+1) \to 0\) as \(n \to \infty\). Deduce that \(M(X) \to 0\) in probability.

  • Let \(X= (X_1, \ldots,X_{2n+1})\) be a random sample from the normal distribution with unknown mean \(\mu \in \mathbb{R}\) and known unit variance. Use the previous exercises to show that the sample median is an unbiased and consistent estimator for \(\mu\). Recall that a (sequence of) estimators is consistent if they converge in probability to the parameter being estimated. (A simulation check appears after this list.)

  • Prove that there exists a deterministic set of positive integers \(S\) such that for every positive integer \(a\), we have \[\frac{ \big| S \cap \{ a, 2a, \ldots, na\} \big| }{n} \to \frac{1}{2}.\] Hint: choose a random subset, and show that there is an event of probability one for which it will satisfy the above requirement. Your final answer should be a deterministic set.

  • Let \(Z\) be a random variable with the standard normal distribution.

    • Show that for \(t >0\), we have \[\mathbb{P}(Z >t)\leq \frac{1}{t\sqrt{2\pi}} e^{-\tfrac{t^2}{2}}.\]

    • Show that for \(t >0\), we have \[\mathbb{P}(Z >t) \geq \frac{1}{\sqrt{2\pi}} \big(\frac{1}{t} - \frac{1}{t^3}\big) e^{-\tfrac{t^2}{2}}.\] Hint: take the derivative of the lower bound and see what you get.

  • Let \((X_i)_{i=3} ^{\infty}\) be an i.i.d. sequence of standard normal variables. Prove that almost surely \[\limsup_{n \to \infty} \frac{X_n}{\sqrt{2 \log n}} =1.\]
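
For the sample-median exercise a few items above, here is a quick empirical check of unbiasedness and consistency for normal samples; the mean \(\mu = 1.5\) and the sample sizes are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(11)

mu = 1.5   # arbitrary true mean; the variance is 1
for m in [11, 51, 251, 1251]:          # odd sample sizes 2n+1
    medians = np.median(rng.normal(mu, 1.0, size=(5_000, m)), axis=1)
    print(f"sample size {m}: mean of sample medians {medians.mean():.4f}, "
          f"std {medians.std():.4f}")
```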

Coupling

  • Let \(X\) and \(Y\) be real-valued random variables. We say that \(X\) stochastically dominates \(Y\) if for all \(z \in \mathbb{R}\), we have \(\mathbb{P}(X \leq z) \leq \mathbb{P}(Y \leq z)\). Let us say that a coupling \((X', Y')\) of \(X\) and \(Y\) is monotone if \(\mathbb{P}(X' \geq Y') =1\).

    • Use the quantile coupling to show that for real-valued random variables \(X\) and \(Y\) we have that \(X\) stochastically dominates \(Y\) if and only if there exists a monotone coupling of \(X\) and \(Y\).
  • Let \(0 < q < p <1\). Let \(X \sim Bin(p)\) and \(Y \sim Bin(q)\). Find a coupling \((X', Y')\) of \(X\) and \(Y\) so that \(X' \geq Y'\).

  • Let \(f:[0,1] \to \mathbb{R}\) be a continuous function. Let \(n\geq 1\). The Bernstein polynomial for \(f\) is defined by \[p(x) = \sum_{k=0}^n f(k/n){n \choose k} x^k(1-x)^{n-k}.\] Show that if \(f\) is an increasing function, then \(p\) is an increasing function.

  • Let \(X\) and \(Y\) be discrete random variables taking values on the space \(S\), with probability mass functions \(p\) and \(q\). Show that there exists a (maximal) coupling \((X', Y')\) of \(X\) and \(Y\) such that equality is achieved in the coupling inequality: \[d_{TV}(X, Y) = d_{TV}(X', Y') = 2 \mathbb{P}(X' \not = Y').\]

  • Let \(X_1, \ldots, X_n\) be independent integer-valued random variables. Also let \(Y_1, \ldots, Y_n\) be independent integer-valued random variables. Set \(S=X_1 + \cdots + X_n\) and \(W = Y_1 + \cdots + Y_n\). Show that

\[d_{TV}(S, W) \leq \sum_{i=1}^n d_{TV}(X_i, Y_i).\]

Markov chains

  • We proved that for an irreducible aperiodic Markov chain \(X\) on a finite state space \(S\), that is started at stationarity, we have that if \[V_n = \sum_{k=0} ^{n-1} \mathbf{1}[X_k=s],\]

then \(V_n/n \to \pi(s)\) in mean square, where \(\pi\) is the stationary distribution. Show that the assumption that the chain is started at stationarity can be removed.

  • Prove that for an irreducible Markov chain on a finite state space \(S\), we have that for each \(s \in S\), the return time \[T = \inf \{n\geq 1: X_n=s\}\] has finite expectation, regardless of the starting distribution of the chain.

  • A measure-preserving system is a probability space \((\Omega, \mathcal{F}, \mu)\) endowed with a self-map \(T: \Omega \to \Omega\), where \(\mu \circ T^{-1} = \mu\). Verify that a Markov chain started at a stationary distribution corresponds to a measure-preserving system. Hint: consider shifting the coordinates of your Markov chain.

  • We say that a measure-preserving system is ergodic if every invariant set has measure zero or one; that is, if \(\mu(A \triangle T^{-1}(A)) = 0\), then \(\mu(A) \in \{0,1\}\). Find an example of a stationary Markov chain with non-trivial invariant sets.

  • Show that the strong mixing condition given by

\[\mu(A \cap T^{-n}B) \to \mu(A) \mu(B) \text{ for all } A, B \in \mathcal{F}\]

implies ergodicity. Note that, by the usual measure theory arguments, verifying the above condition for a large enough class of \(A,B\) is equivalent to verifying the strong mixing condition.

  • Check that aperiodic irreducible finite state Markov chains, started at the stationary distribution, are strongly mixing. This allows us to obtain limit theorems from the general statements in ergodic theory. Hint: first consider the case corresponding to the events \(A = \{ X_1=s \}\) and \(B =\{ X_3=t\}\).

  • The von Neumann ergodic theorem gives that for any stationary ergodic process \(X = (X_k)_{k=0} ^{\infty}\) endowed with the left-shift, where \((T X)_i = X_{i+1}\), we have the following convergence in mean square

\[ \frac{1}{n} \sum_{k=0} ^{n-1} f( T^k X) \to \mathbb{E} f(X)\]

for all \(f\) such that \(\mathbb{E} [f(X)^2] < \infty\). Consider the function \(f\) where

\[f(x) = \mathbf{1}[x_{2} = c, x_1 = b, x_0 =a].\] What happens for a Markov chain? Run simulations and check that everything is consistent, for a particular Markov chain.
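
A minimal sketch of such a consistency check, assuming a particular two-state chain (the transition matrix, the pattern \((a,b,c)\), and the trajectory length are all arbitrary choices): the ergodic average of \(f\) along one long stationary trajectory is compared with \(\mathbb{E} f(X) = \pi(a) P(a,b) P(b,c)\).

```python
import numpy as np

rng = np.random.default_rng(12)

# a concrete two-state chain for the check
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])
pi = np.array([4/7, 3/7])          # stationary distribution of P
a, b, c = 0, 1, 1                  # the pattern in f(x) = 1[x_2 = c, x_1 = b, x_0 = a]

# simulate a long trajectory started at stationarity
n = 200_000
u = rng.random(n)
x = np.empty(n, dtype=int)
x[0] = rng.choice(2, p=pi)
for k in range(1, n):
    x[k] = 1 if u[k] < P[x[k - 1], 1] else 0

# ergodic average of f(T^k X) versus E f(X) = pi(a) P(a, b) P(b, c)
ergodic_avg = np.mean((x[:-2] == a) & (x[1:-1] == b) & (x[2:] == c))
print(ergodic_avg, pi[a] * P[a, b] * P[b, c])
```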

Endnotes