Determining the distribution of a variable

musicgold
Hi,

I am trying to understand how to predict the distribution of a variable that is a combination of other random variables. For example, how should I determine the distribution of Variable A, which is the sum of Variable X and Variable Y? Variable X has a normal distribution and Variable Y has a uniform distribution.

Also, what will be the distribution of Variable B, which is the product of X and Y?

Thanks.
 
musicgold said:
I am trying to understand how to predict the distribution of a variable that is a combination of other random variables.

The very general idea is this: if A is a function of X and Y and you want the probability that A = r, then you must add up the probabilities of all the combinations of values of X and Y that make the function equal to r.

In the case of continuous random variables, the "adding up" is done by integration instead of taking a finite sum.

In the special case where A is the sum of X and Y, the process of doing the summation or integration is called a "convolution".

You can think of the convolution that computes Pr( A = r ) for A = X + Y as a summation over all values x and y such that x + y = r. So if we take y as given, then x = r - y.

The summation is:

Pr( A = r ) = \sum_y Pr( X = r - y \text{ and } Y = y )

taken over all possible values of y. If X and Y are independent, the joint probability factors: Pr( X = r-y and Y = y ) = Pr( X = r-y ) Pr( Y = y ).
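(A familiar discrete illustration, not from the original question: for two independent fair dice,

Pr( A = 7 ) = \sum_{y=1}^{6} Pr( X = 7 - y ) \, Pr( Y = y ) = 6 \cdot \tfrac{1}{6} \cdot \tfrac{1}{6} = \tfrac{6}{36}

which is the usual answer for rolling a seven.)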

For continuous random variables, if X and Y are independent and X has density f(x) and Y has density g(y) the density h(r) of A = X + Y is

h(r) = \int f(r-y) g(y) \ dy
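For reference, here is a minimal sketch of setting up that integral in Mathematica for the distributions in the original question (X standard normal, Y uniform on 0 to 1; the names f, g, h are mine, not built-ins):

Code:
f[x_] := PDF[NormalDistribution[0, 1], x];    (* density of X *)
g[y_] := PDF[UniformDistribution[{0, 1}], y]; (* density of Y *)
(* convolution h(r); the support of Y restricts the integral to [0, 1] *)
h[r_] = Integrate[f[r - y] g[y], {y, 0, 1}, Assumptions -> r \[Element] Reals]

Mathematica should return an Erf expression equivalent to the cumulative-normal answer worked out below.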


Do you feel up to doing the calculus to solve your particular example? I think the answer will involve the cumulative distribution function of the normal, so it isn't a "closed form" solution.
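Carrying the integral out for that example (standard normal X, Y uniform on [0, 1]):

h(r) = \int_0^1 \phi(r - y) \ dy = \Phi(r) - \Phi(r-1)

where \phi is the standard normal density and \Phi its cumulative distribution function.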

For the case of the product of two random variables, A = XY, you need to do a calculation that sums (or integrates) over all possible values of x, y where xy = r. So x = r/y.
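For independent continuous X and Y, that change of variables gives the standard product-density formula (the 1/|y| is the Jacobian factor from solving x = r/y):

h(r) = \int f(r/y) \, g(y) \, \frac{1}{|y|} \ dy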
 
If you're only interested in the moments of A = X + Y, then the moment generating function becomes very useful, because the MGF of A will be the product of the MGFs of X and Y (provided they're independent).
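A minimal sketch of that in Mathematica for the same example (MomentGeneratingFunction is built in; mgfA is my own name):

Code:
(* MGF of A = X + Y is the product of the individual MGFs *)
mgfA[t_] = MomentGeneratingFunction[NormalDistribution[0, 1], t]*
    MomentGeneratingFunction[UniformDistribution[{0, 1}], t];
(* => E^(t^2/2) (E^t - 1)/t *)
(* first moment: differentiate at t = 0; a limit handles the removable 0/0 *)
Limit[D[mgfA[t], t], t -> 0]  (* => 1/2, i.e. E[A] = E[X] + E[Y] = 0 + 1/2 *)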

If you have access to a good table of integral transforms, or to Mathematica or Maple, this is also an easy way to get the pdf of your variable. For example, taking the uniform distribution over the interval from 0 to 1 and the Gaussian kernel Exp[-x^2] (note this is proportional to a normal density with mean 0 and variance 1/2, not the normalized standard normal, so the output below is the convolution only up to a constant factor), you'd just do

Code:
(* multiply the transforms of the box (uniform) and the Gaussian kernel, then invert *)
InverseFourierTransform[(FourierTransform[
    UnitStep[x] - UnitStep[x - 1], x, t] FourierTransform[Exp[-x^2],
    x, t]), t, x]

And Mathematica spits out

\frac{\text{Erf}[1-x]+\text{Erf}[x]}{2 \sqrt{2}}

Note that above I used the characteristic function (the Fourier transform of the pdf), not the MGF, but it's the same idea.
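As a cross-check, here is a sketch using TransformedDistribution (available in Mathematica 8 and later), this time with the properly normalized standard normal:

Code:
(* build the distribution of A = X + Y directly *)
distA = TransformedDistribution[x + y,
   {x \[Distributed] NormalDistribution[0, 1],
    y \[Distributed] UniformDistribution[{0, 1}]}];
PDF[distA, r]  (* should match the convolution result: an Erf form of Phi(r) - Phi(r - 1) *)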
 
Namaste & G'day Postulate: A strongly-knit team wins on average over a less knit one Fundamentals: - Two teams face off with 4 players each - A polo team consists of players that each have assigned to them a measure of their ability (called a "Handicap" - 10 is highest, -2 lowest) I attempted to measure close-knitness of a team in terms of standard deviation (SD) of handicaps of the players. Failure: It turns out that, more often than, a team with a higher SD wins. In my language, that...
Hi all, I've been a roulette player for more than 10 years (although I took time off here and there) and it's only now that I'm trying to understand the physics of the game. Basically my strategy in roulette is to divide the wheel roughly into two halves (let's call them A and B). My theory is that in roulette there will invariably be variance. In other words, if A comes up 5 times in a row, B will be due to come up soon. However I have been proven wrong many times, and I have seen some...

Similar threads

Back
Top