Understanding the Role of Uniform Random Variables in Monte Carlo Simulation

MIA6
Hi all! I started learning about Monte Carlo simulation. However, one thing I don't quite get is why, to generate any random variable, we first have to generate a uniform RV. What is the reason behind that?

Thanks!
 
Because that's what we know how to create. Kind of. Those random number generators are more appropriately called pseudo-random number generators.
 
The idea is that the computer can only simulate random number generation, so we can't just tell it "go generate numbers according to a beta distribution". What we can do is simulate numbers and verify fairly easily that they roughly correspond to a uniform distribution, and then apply suitable conditions/transformations to make them follow other distributions. This actually turns out to be fairly difficult beyond the standard distributions; if you ever study Bayesian statistics, you'll learn some fairly complex methods that are required to simulate certain posterior distributions (e.g. Metropolis-Hastings).
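To make that concrete, here is a minimal random-walk Metropolis sketch in Python (a special case of Metropolis-Hastings; the standard-normal target, step size, and chain length are illustrative choices, not anything from this thread). Note that even this method is ultimately driven by uniform draws, both for the proposal and for the accept/reject step:

```python
import math
import random

def metropolis(log_target, x0, steps, step_size=1.0):
    """Random-walk Metropolis sampler driven entirely by uniform draws.
    `log_target` is the log of a (possibly unnormalised) target density."""
    x, chain = x0, []
    for _ in range(steps):
        # Symmetric proposal: a uniform step around the current point.
        proposal = x + random.uniform(-step_size, step_size)
        # Accept with probability min(1, target(proposal) / target(x)).
        accept_prob = math.exp(min(0.0, log_target(proposal) - log_target(x)))
        if random.random() < accept_prob:
            x = proposal
        chain.append(x)
    return chain

# Illustrative target: a standard normal, specified only up to a constant.
chain = metropolis(lambda x: -0.5 * x * x, x0=0.0, steps=50_000)
print(sum(chain) / len(chain))  # should be near the target mean, 0
```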
 
Hi, Number Nine. Yeah, from what I've learned so far, the first step is always to generate uniform RVs first, then do some transformation to make them follow the distribution that we want. So we use a uniform RV because it is very simple, as opposed to picking, say, an exponential RV first? So it is just the way it is?
 
We don't know how to pick an exponential RV first.
 
MIA6 said:
So we use a uniform RV because it is very simple, as opposed to picking, say, an exponential RV first?

You can't just "pick <blank> random variables". How would you do it?
For extremely simple (e.g. one-dimensional) distributions, you could, of course, generate some random numbers and systematically reject some of them so that your collection roughly conforms to some distribution, but this is horribly inefficient.
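As a concrete illustration of that reject-some-of-them idea, here is a minimal one-dimensional rejection sampler in Python (the triangular target density and its bound are my own illustrative choices):

```python
import random

def sample_by_rejection(pdf, bound, lo, hi):
    """Draw one sample from `pdf` on [lo, hi] by rejection.
    `bound` must satisfy bound >= pdf(x) on [lo, hi]. Each attempt
    spends two uniform numbers and keeps the candidate only if a
    uniform height falls under the density curve."""
    while True:
        x = random.uniform(lo, hi)        # candidate from a uniform proposal
        u = random.uniform(0.0, bound)    # uniform height in the bounding box
        if u < pdf(x):
            return x

# Illustrative target: the triangular density f(x) = 2x on [0, 1], max value 2.
samples = [sample_by_rejection(lambda x: 2.0 * x, 2.0, 0.0, 1.0)
           for _ in range(10_000)]
print(sum(samples) / len(samples))        # should be close to E[X] = 2/3
```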
 
Hi,

The uniform(0,1) distribution, whose values are called 'random numbers', has the property that P{U <= x} = x for 0 < x < 1.

So if you have a random variable X whose distribution is known, that is, its distribution function F(x) is given, then the event {X <= x} has probability F(x) of occurring, which is equal to the probability of the event {U <= F(x)},
since P{X <= x} = F(x) and P{U <= F(x)} = F(x), by the property above.

Note that 0 < F(x) < 1.

(Very) loosely speaking, the probability of X taking the value x is the same as the probability of U taking the value F(x), where U is uniform(0,1).
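In code this is the inverse transform method: if U is uniform(0,1) and F is invertible, then X = F^{-1}(U) has distribution function F. A small Python sketch for the exponential distribution (the rate parameter lam = 2.0 is just an illustrative choice), where F(x) = 1 - exp(-lam*x) inverts to F^{-1}(u) = -ln(1 - u)/lam:

```python
import math
import random

def exponential_via_inverse_cdf(lam):
    """Sample from Exp(lam) by inverting its CDF F(x) = 1 - exp(-lam*x)."""
    u = random.random()                # a uniform(0,1) random number
    return -math.log(1.0 - u) / lam    # x = F^{-1}(u)

samples = [exponential_via_inverse_cdf(2.0) for _ in range(100_000)]
print(sum(samples) / len(samples))     # should be close to 1/lam = 0.5
```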
 
MIA6 said:
So we use a uniform RV because it is very simple?

Yes, picking pseudo-random numbers from a uniform distribution is simple in two respects.

First, the transformation to another random variable involves the "natural" mathematical question of how to invert the cumulative distribution of the other random variable, which would be an important question even if people weren't doing random sampling. Even if the formula for inverting a given cumulative distribution is not simple, it is usually a topic that has been studied, and one can look up algorithms to do it.

Second, there are effective and well-studied computer algorithms for generating uniform pseudo-random numbers (e.g. linear congruential generators).
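For concreteness, here is a toy linear congruential generator in Python (the constants are the well-known Park-Miller "minimal standard" choices; production libraries use more elaborate generators such as the Mersenne Twister):

```python
def lcg(seed, a=16807, m=2**31 - 1):
    """Yield pseudo-random uniform(0,1) numbers from the linear congruential
    recurrence x_{n+1} = (a * x_n) mod m (Park-Miller "minimal standard")."""
    x = seed
    while True:
        x = (a * x) % m
        yield x / m            # scale the integer state into (0, 1)

gen = lcg(seed=12345)
print([next(gen) for _ in range(5)])
```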

It's an interesting question whether there could be other useful algorithms that generate non-uniform pseudo-random numbers without making use of the algorithms commonly used for generating uniform random numbers. I'm not aware of any mathematical theory that proves such things cannot exist, but I also don't know of any such algorithms.
 
Number Nine said:
For extremely simple (e.g. one-dimensional) distributions, you could, of course, generate some random numbers and systematically reject some of them so that your collection roughly conforms to some distribution, but this is horribly inefficient.

Just for the record, there are some quite efficient acceptance-rejection methods for *exact* sampling of arbitrary distributions in higher dimensions (remarkably, even where the pdf normalisation constant is unknown), though of course these are all based on uniform variates.
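To illustrate the point about unknown normalising constants (my own toy example, not taken from the thread): the rejection sketch only ever compares a uniform height against a function *proportional* to the pdf, so the constant never needs to be computed.

```python
import math
import random

def sample_unnormalised(g, bound, lo, hi):
    """Draw one exact sample from the density proportional to g on [lo, hi].
    `bound` must satisfy bound >= g(x) on [lo, hi]; the normalising
    constant of g is never needed."""
    while True:
        x = random.uniform(lo, hi)
        if random.uniform(0.0, bound) < g(x):
            return x

# g is proportional to a standard normal truncated to [-3, 3]; its max is g(0) = 1.
g = lambda x: math.exp(-0.5 * x * x)
samples = [sample_unnormalised(g, 1.0, -3.0, 3.0) for _ in range(10_000)]
print(sum(samples) / len(samples))   # symmetric density, so the mean is near 0
```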
 
