
Random Number Generators

Before discussing the uses of random numbers it is useful to have some idea of how random numbers are generated on a computer. Most methods depend on a deterministic but chaotic-looking sequence. The commonest is the multiplicative congruential method, which relies on the properties of prime numbers. Consider the sequence
\begin{displaymath}
x_{n+1} = (a\,x_n) \bmod b
\end{displaymath} (4.1)

where $x \bmod y$ refers to the remainder on dividing $x$ by $y$, and $a$ and $b$ are large integers with no common factors (often both are prime numbers). With a suitable choice of $a$ and $b$ this process generates every integer between $1$ and $b-1$ in an apparently random order. Once all $b-1$ integers have been generated the sequence repeats itself, so one important question to ask about any random number generator is the length of its period.

It is worth noting that the sequence is completely deterministic: if the same initial seed, $x_0$, is chosen, the same sequence will be generated. This property is extremely useful for debugging purposes, but can be a problem when averaging has to be done over several runs. In such cases it is usual to initialise the seed from the system clock.
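
As a concrete sketch, the generator below implements the recurrence (4.1) in Python. The constants are the Park and Miller "minimal standard" choice ($a = 16807$, $b = 2^{31}-1$), used here purely for illustration; the function name lcg is mine.

\begin{verbatim}
import time

def lcg(seed, a=16807, b=2**31 - 1):
    # Multiplicative congruential generator: x_{n+1} = (a * x_n) mod b.
    # b is prime and a is a primitive root modulo b, so the sequence
    # visits every integer in 1..b-1 before repeating.
    x = seed % b
    if x == 0:
        x = 1  # a zero seed would lock the sequence at zero
    while True:
        x = (a * x) % b
        yield x / b  # uniform float in (0, 1); yield x instead for integers

gen = lcg(seed=12345)             # same seed -> same sequence (debugging)
print([next(gen) for _ in range(3)])

gen = lcg(seed=int(time.time()))  # clock seed for independent runs
\end{verbatim}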

Routines exist which generate a sequence of integers, as described, or which generate floating point numbers in the range $0$ to $1$. Most other distributions can be derived from these. There are also very efficient methods for generating a sequence of random bits (Press et al., 1992). The Numerical Algorithms Group library contains routines to generate a wide range of distributions.

A difficult but common case is the Gaussian distribution. One method simply averages several (say $10$) uniform ($0\to 1$) random numbers and relies on the central limit theorem. Another, the Box-Muller method, uses the fact that a distribution of complex numbers whose real and imaginary parts are both Gaussian distributed can also be represented as a distribution of amplitude and phase, in which the squared amplitude has an exponential distribution and the phase is uniformly distributed between $0$ and $2\pi$.
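
Both methods can be sketched in a few lines of Python (the function names are mine, and the standard library's random.random() stands in for the uniform generator):

\begin{verbatim}
import math
import random

def gauss_clt(n=10):
    # Approximate N(0,1): a sum of n uniform deviates has mean n/2
    # and variance n/12, so shift and rescale accordingly.
    s = sum(random.random() for _ in range(n))
    return (s - n / 2.0) / math.sqrt(n / 12.0)

def gauss_box_muller():
    # Exact N(0,1): squared amplitude exponential, phase uniform.
    r = math.sqrt(-2.0 * math.log(1.0 - random.random()))  # avoid log(0)
    theta = 2.0 * math.pi * random.random()
    return r * math.cos(theta)  # r*sin(theta) is an independent second deviate
\end{verbatim}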

As an example of generating another distribution we consider the exponential case,

\begin{displaymath}
p(y) = \exp(-y), \qquad y \ge 0.
\end{displaymath} (4.2)

Then, if $y = f(x)$ maps one random variable to another, and $p'(x)$ and $p(y)$ are the probability distributions of $x$ and $y$ respectively,
\begin{displaymath}
\int_{-\infty}^{y=f(x)} p(y')\,\d y' =
\int_{-\infty}^{x} p'(x')\,\d x'
\end{displaymath} (4.3)

must be true for all $x$, as the probability of finding any $y' < y$ must be the same as that for finding any $x' < x$. It follows that
\begin{displaymath}
p(y) = p'(x)\left\vert\frac{\d x}{\d y}\right\vert.
\end{displaymath} (4.4)

If $x$ is uniformly distributed between $0$ and $1$, i.e. $p'(x) = 1$, then
\begin{displaymath}
p(y) = \left\vert\frac{\d x}{\d y}\right\vert = \exp(-y) \qquad\rightarrow\qquad
y = -\ln x.
\end{displaymath} (4.5)

Hence, an exponentially distributed number is generated by taking minus the logarithm of a number drawn from a uniform distribution.
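
A minimal sketch of this inverse-transform recipe in Python (the helper name exponential_deviate is mine; the standard library's random module again supplies the uniform deviates):

\begin{verbatim}
import math
import random

def exponential_deviate():
    # Eq. (4.5): y = -ln x with x uniform on (0, 1).
    x = 1.0 - random.random()  # random() lies in [0, 1); this avoids log(0)
    return -math.log(x)

print([round(exponential_deviate(), 4) for _ in range(5)])
\end{verbatim}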

