Defining and generating randomness

  • You may recall that random variables live on a sample space, but has anyone told you the address of this sample space?
  • You have generated random variables in R, but how does R do that?

Theory

  • Recall that a probability space is \((\Omega, \mathcal{F}, \mathbb{P})\), where \(\Omega\) is the sample space, \(\mathcal{F}\) is a collection of subsets of \(\Omega\) whose elements are called events, and \(\mathbb{P}\) is a countably additive probability measure that assigns each event a number in \([0,1]\)
  • A (real-valued) random variable is a map \(X : \Omega \to \mathbb{R}\) such that \(X^{-1}(I) = \{\omega \in \Omega: X(\omega) \in I\} \in \mathcal{F}\) for all open intervals \(I \subset \mathbb{R}\).
  • Its law or distribution is given by the cumulative distribution function \[F(x) = \mathbb{P}(X \leq x) = \mathbb{P}( \{ \omega \in \Omega: X(\omega) \leq x\}).\]
  • Many properties of \(X\) depend only on its law, so in elementary modules on probability it is easy to forget that random variables live on a sample space.

Can we forget about sample space?

  • One reason why you haven’t heard much about sample spaces is that it is not easy to prove the existence of a sample space that can support a random variable that is uniformly distributed on the unit interval.
  • However, if you are willing to assume the existence of even one such random variable, on some sample space, this is enough to do most of probability theory.

The inverse transform method

  • Let \(F\) be the cdf of a continuous real-valued random variable. For simplicity, assume that \(F\) is strictly increasing, so that \(F^{-1}\) exists, and let \(U\) be uniformly distributed on \([0,1]\).
  • Notice that the random variable \(X=F^{-1}(U)\) has cdf \(F\), since \(F^{-1}(U) \leq x\) if and only if \(U \leq F(x)\): \[ \begin{eqnarray*} \mathbb{P}( F^{-1}(U) \leq x) &=& \mathbb{P}(U \leq F(x)) \\ &=& F(x) \end{eqnarray*} \]
  • If we need to generate, say, a Bernoulli(\(p\)) random variable, then we can set \(\phi(U) = 1\) if \(U \in [0,p]\) and \(\phi(U)=0\) if \(U \in (p, 1]\).
  • Other discrete distributions can be dealt with similarly; a short R illustration of the inverse transform method follows.
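As an illustration, here is a minimal R sketch of the inverse transform method for the exponential distribution: \(F(x) = 1 - e^{-\lambda x}\) for \(x \geq 0\), so \(F^{-1}(u) = -\log(1-u)/\lambda\). The function name rexp_inverse is ours.

rexp_inverse <- function(n, lambda = 1){
  # inverse transform: F^{-1}(u) = -log(1 - u) / lambda
  u <- runif(n)
  -log(1 - u) / lambda
}

samples <- rexp_inverse(1000, lambda = 2)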

Independent random variables

  • Functions of independent random variables are again independent.

  • It suffices to have independent uniform random variables.

  • If \(U\) is uniformly distributed on \([0,1]\), write \[ U= 0.X_1 X_2 X_3 X_4 \cdots\] for the usual base-\(10\) expansion of \(U\), so that the \(X_i\) are random variables taking the integer values \(0,1,2, \ldots, 9\).

  • It is not hard to show that the \(X_i\) are independent and uniformly distributed on \(\{0, 1, \ldots, 9\}\).

  • By rearranging and recombining the digits \(X_i\), from one uniform random variable \(U\) you can produce as many independent uniform random variables as you like; for example, the odd-indexed digits and the even-indexed digits give two independent uniforms, as in the sketch below.
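Here is a toy R sketch of this splitting, necessarily limited by floating-point precision; the function name split_uniform and the choice of \(k = 7\) digit pairs are ours, not part of the notes.

split_uniform <- function(u, k = 7){
  # extract the first 2k base-10 digits of u
  digits <- floor(u * 10^(1:(2 * k))) %% 10
  # recombine the odd- and even-indexed digits into two uniforms
  u1 <- sum(digits[seq(1, 2 * k, by = 2)] * 10^-(1:k))
  u2 <- sum(digits[seq(2, 2 * k, by = 2)] * 10^-(1:k))
  c(u1, u2)
}

split_uniform(runif(1))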

What does R do?

  • We already saw that the key is uniform random variables, which can be thought of as a sequence of independent digits.
  • R simulates uniform random variables using a pseudorandom number generator.
  • The design of such generators is an active area of research.
  • Lehmer random number generator: take \(m\) to be prime, choose a seed \(0 < X_0 < m\), and set \[X_{k+1} = a \cdot X_k \bmod m\]
    • \(a\) is chosen to be a primitive root modulo \(m\).
  • The \(X_k\) behave like independent random variables that are uniformly distributed on \(\{1, \ldots, m-1\}\); dividing by \(m\) gives approximately uniform values in \((0,1)\) (see the sketch below).
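A minimal R sketch of a Lehmer generator, using the classical MINSTD parameters \(m = 2^{31}-1\) (a Mersenne prime) and \(a = 16807\) (a primitive root modulo \(m\)); the function name lehmer is ours.

lehmer <- function(n, seed = 1, a = 16807, m = 2^31 - 1){
  x <- numeric(n)
  state <- seed
  for (k in 1:n) {
    # X_{k+1} = a * X_k mod m; a * state < 2^53, so the arithmetic is exact
    state <- (a * state) %% m
    x[k] <- state
  }
  x / m  # rescale to (0, 1)
}

u <- lehmer(1000)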

Box-Muller for normal random variables

  • The inverse transform method is often too slow; for the normal distribution, \(F^{-1}\) has no closed form and must be computed numerically.

  • Box, G. E. P. and Muller, M. E. A note on the generation of random normal deviates. Ann. Math. Stat. 29 (1958), 610–611.

  • Let \(U\) and \(V\) be independent random variables that are uniformly distributed on the unit interval. Set \[X = \sqrt{ -2 \log U} \cos (2 \pi V)\] and \[Y = \sqrt{ -2 \log U} \sin (2 \pi V).\] Then \(X\) and \(Y\) are independent standard normals (a sketch in R follows).
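A minimal R sketch of the Box-Muller transform; the function name boxmuller is ours.

boxmuller <- function(n){
  u <- runif(n)
  v <- runif(n)
  r <- sqrt(-2 * log(u))
  # each call returns 2n independent standard normals
  c(r * cos(2 * pi * v), r * sin(2 * pi * v))
}

z <- boxmuller(50000)
hist(z, prob = TRUE, breaks = 100)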

Proof

We solve for \((U,V)\) to obtain \[U = \exp[ -(X^2 + Y^2)/2]\] and \[V = \frac{1}{2\pi} \arctan(Y /X).\] Thus the map \[(u, v) \mapsto \big (\sqrt{ -2 \log u} \cos (2 \pi v), \sqrt{ -2 \log u} \sin (2 \pi v) \big)\] is a bijection from \((0,1) \times (0,1)\) onto \(\mathbb{R}^2\) minus the ray \(\{(x, 0) : x \geq 0\}\), a set of measure zero.

Continued

The Jacobian of this inverse map is \[J(x,y) = -\frac{1}{2\pi } e^{-(x^2 + y^2)/2}.\] Since \((U,V)\) has joint pdf \(1\) on the unit square, the joint pdf of \((X,Y)\) is \(|J(x,y)|\); that is, \[(x,y) \mapsto \frac{1}{\sqrt{2 \pi} } e^{-x^2/2} \cdot \frac{1}{\sqrt{2 \pi}} e^{-y^2/2},\] as desired.

Acceptance/Rejection methods: Sampling on a disc

We simulate a single uniformly distributed point on the disc by simulating a uniform point on the square \([-1,1]^2\), accepting it if it lands inside the inscribed unit disc, rejecting it otherwise, and repeating until acceptance; each trial is accepted with probability \(\pi/4\), the ratio of the areas.

point <- function(){
  # repeat until a uniform point on the square [-1,1]^2
  # lands inside the inscribed unit disc
  repeat {
    x <- 2 * runif(1) - 1
    y <- 2 * runif(1) - 1
    if (x^2 + y^2 < 1) return(c(x, y))
  }
}

re <- replicate(500, point())
plot(re[1,], re[2,], xlim=c(-1.1,1.1), ylim=c(-1.1,1.1), asp=1)

Acceptance/Rejection: Sampling from a pdf

  • Suppose we want to generate a random variable \(X\) with pdf \(f\).

  • Suppose \(f\) is complicated, with a cdf whose inverse is difficult to compute, so the inverse transform method is impractical.

  • Suppose we can easily generate \(Y\) which has pdf \(g\).

  • Suppose there exists a constant \(M\) such that \(f(x) \leq M g(x)\) for all \(x\).

  • Then you are in good shape!

Procedure

  • Let \(U\) be uniformly distributed on \([0,1]\) and independent of \(Y\)
  • Sample a value \((u, y)\) of \((U, Y)\)
  • Check whether \[ u \leq \frac{f(y)}{M \cdot g(y)}\]
  • If the inequality holds, accept and report \(X=y\); otherwise reject and repeat. A generic sketch in R follows.
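A minimal R sketch of this procedure, assuming a hypothetical sampler rg for \(g\) and density functions densf and densg for \(f\) and \(g\); none of these names come from the notes.

rejection_sample <- function(rg, densf, densg, M){
  repeat {
    y <- rg()       # proposal drawn from g
    u <- runif(1)   # independent uniform
    # accept with probability f(y) / (M g(y))
    if (u <= densf(y) / (M * densg(y))) return(y)
  }
}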

Why?

Observe that by the change of variables formula and Fubini's theorem, we have \[ \begin{eqnarray*} && \mathbb{P}(Y \leq x, MU g(Y) \leq f(Y)) \\ &=& \int_{-\infty} ^x \int_0 ^1 \mathbf{1}[Mu g(y) \leq f(y)]g(y)\,du\,dy \\ &=& \int_{-\infty} ^x \mathbb{P}[U \leq f(y)/(Mg(y))] \cdot g(y)\, dy \\ &=& \int_{-\infty} ^x f(y)/M\, dy \\ &=& F(x)/M. \end{eqnarray*} \]

Why (continued)

Taking \(x \to \infty\) shows that the probability of acceptance is \(\mathbb{P}(MU g(Y) \leq f(Y)) = 1/M\). Thus \[ \mathbb{P}(Y \leq x | MU g(Y) \leq f(Y)) = F(x).\] So, conditional on acceptance, the distribution of \(Y\) is the distribution of \(X\), the random variable we want to sample/simulate.

A simple example

  • Consider the pdf \(f(x) = 3x^2\) on the unit interval \([0,1]\).
  • Consider the pdf \(g(x) = 1\) on the unit interval.
  • Of course, \(f(x) \leq 3 g(x)\), so we may take \(M = 3\).

Code

samplef <- function(){
  # acceptance/rejection with target f(y) = 3y^2,
  # proposal g = 1 on [0,1], and M = 3
  repeat {
    y <- runif(1)
    u <- runif(1)
    # accept if u <= f(y) / (M g(y)) = y^2
    if (u <= y^2) return(y)
  }
}

Histogram

accepted <- replicate(100000, samplef())
B <- seq(0, 1, by = 0.01)
hist(accepted, prob = TRUE, breaks = B)
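To check the fit, one can overlay the target pdf on the histogram:

curve(3 * x^2, add = TRUE)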

Summary

  • We rediscovered probability spaces
  • The central role of uniform random variables in theory and practice
  • Sampling methods such as acceptance/rejection

Version: 10 October 2020