Determining the distribution of a variable

  • Context: Graduate
  • Thread starter: musicgold
  • Tags: distribution, variable
SUMMARY

This discussion focuses on determining the distribution of a variable that is a combination of other random variables, specifically Variable A, which is the sum of Variable X (normally distributed) and Variable Y (uniformly distributed). The convolution process is essential for calculating the probability density function (pdf) of A, represented as h(r) = ∫ f(r-y) g(y) dy for independent continuous variables. For the product of two random variables, A = X * Y, a similar integration approach is required, where x = r/y. Tools like Mathematica or Maple can facilitate these calculations, particularly for complex distributions.

PREREQUISITES
  • Understanding of convolution in probability theory
  • Familiarity with probability density functions (pdfs)
  • Knowledge of moment generating functions (MGFs)
  • Basic calculus skills for integration
NEXT STEPS
  • Learn about convolution of probability distributions
  • Study the properties and applications of moment generating functions (MGFs)
  • Explore the use of Mathematica for statistical calculations
  • Investigate the characteristics of normal and uniform distributions
USEFUL FOR

Statisticians, data scientists, and anyone involved in probabilistic modeling or statistical analysis will benefit from this discussion.

musicgold
Hi,

I am trying to understand how to predict the distribution of a variable that is a combination of other random variables. For example, how should I determine the distribution of Variable A, which is the sum of Variable X and Variable Y? Variable X has a normal distribution and Variable Y has a uniform distribution.

Also, what will be the distribution of Variable B, which is the product of X and Y?

Thanks.
 
musicgold said:
I am trying to understand how to predict the distribution of a variable that is a combination of other random variables.

The very general idea is that if you want to know the probability that A = r and A is a function of X and Y, then you must add up the probabilities of all the combinations of values of X and Y that make the function equal to r.

In the case of continuous random variables, the "adding up" is done by integration instead of taking a finite sum.

In the special case where A is the sum of X and Y, the process of doing the summation or integration is called a "convolution".

You can think of the convolution that computes Pr(A = X + Y = r) as doing the summation over all values x and y such that x + y = r. So if we take y as given then x = r - y.

The summation is:

the sum over all possible values of y of Pr(X = r - y and Y = y).

If X and Y are independent then the joint probability Pr(X = r - y and Y = y) factors as Pr(X = r - y) Pr(Y = y).
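As a concrete discrete illustration of that factorization (my own example, not from the thread): the pmf of the sum of two independent fair dice, computed by exactly this convolution sum.

```python
from fractions import Fraction

# X and Y are independent fair six-sided dice, each value with probability 1/6.
die = {k: Fraction(1, 6) for k in range(1, 7)}

def pmf_of_sum(px, py):
    """Convolve two discrete pmfs given as {value: probability} dicts:
    Pr(A = r) = sum over y of Pr(X = r - y) * Pr(Y = y)."""
    out = {}
    for y, py_val in py.items():
        for x, px_val in px.items():
            out[x + y] = out.get(x + y, 0) + px_val * py_val
    return out

total = pmf_of_sum(die, die)
print(total[7])  # Pr(sum of two dice = 7)
```

The six pairs (1,6), (2,5), …, (6,1) each contribute 1/36, giving 1/6 for a total of 7.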

For continuous random variables, if X and Y are independent and X has density f(x) and Y has density g(y) the density h(r) of A = X + Y is

[tex]h(r) = \int f(r-y) g(y) \ dy[/tex]
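As a numeric sketch of this integral for the thread's example (the parameter choices are mine, since the thread doesn't fix them: X standard normal, Y uniform on [0, 1]), the convolution reduces in closed form to h(r) = Phi(r) - Phi(r - 1), where Phi is the standard normal cdf. A midpoint-rule integration agrees:

```python
import math

def phi(x):
    """Standard normal pdf f(x)."""
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def Phi(x):
    """Standard normal cdf, written with the error function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def h_numeric(r, n=20000):
    """Midpoint-rule approximation of h(r) = integral over [0,1] of f(r - y) dy;
    g(y) = 1 on [0, 1] for the Uniform(0, 1) density, so g drops out."""
    dy = 1.0 / n
    return sum(phi(r - (i + 0.5) * dy) for i in range(n)) * dy

def h_closed(r):
    """The same integral in closed form: h(r) = Phi(r) - Phi(r - 1)."""
    return Phi(r) - Phi(r - 1)

print(h_closed(0.5))
```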


Do you feel up to doing the calculus to solve your particular example? I think the answer will involve the cumulative distribution function of the normal distribution, so it isn't a "closed form" solution in elementary functions.

For the case of the product of two random variables A = (X)(Y) you need to do a calculation that sums (or integrates) over all possible values of x, y where xy = r, so x = r/y. In the continuous case the change of variables introduces a Jacobian factor of 1/|y|, giving

[tex]h(r) = \int f(r/y) \, g(y) \, \frac{1}{|y|} \ dy[/tex]
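Note that in the continuous setting the substitution x = r/y contributes a 1/|y| Jacobian factor, so the product density is h(r) = ∫ f(r/y) g(y) / |y| dy. A minimal numeric check, assuming (my choice) X and Y are both independent Uniform(0, 1) variables, where this integral works out to -ln(r) on (0, 1):

```python
import math

def h_product_numeric(r, n=20000):
    """Midpoint-rule value of h(r) = integral over [r, 1] of (1/y) dy
    for X, Y ~ Uniform(0, 1): f(r/y) = 1 only when 0 < r/y < 1, i.e. y > r,
    and the 1/|y| Jacobian is all that remains of the integrand."""
    dy = (1.0 - r) / n
    return sum(1.0 / (r + (i + 0.5) * dy) for i in range(n)) * dy

r = 0.25
print(h_product_numeric(r))  # should approximate -ln(0.25)
```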
 
If you're only interested in the moments of A = X + Y, then the moment generating function becomes very useful, because the MGF of A will be the product of the MGFs of X and Y (provided they're independent).
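A quick numeric check of that product identity, again assuming (my choice) X ~ N(0, 1) and Y ~ Uniform(0, 1): the MGF of the sum, computed as a double integral over the joint density, should match exp(t^2/2) * (e^t - 1)/t, the product of the two marginal MGFs.

```python
import math

def mgf_sum_numeric(t, nx=2000, ny=200):
    """E[exp(t(X + Y))] by a double midpoint sum over the joint density,
    with the N(0,1) factor truncated to [-10, 10] (the tail is negligible)
    and Y ~ Uniform(0, 1), density 1 on [0, 1]."""
    dx, dy = 20.0 / nx, 1.0 / ny
    total = 0.0
    for i in range(nx):
        x = -10.0 + (i + 0.5) * dx
        fx = math.exp(-x * x / 2) / math.sqrt(2 * math.pi)
        for j in range(ny):
            y = (j + 0.5) * dy
            total += math.exp(t * (x + y)) * fx * dx * dy
    return total

t = 0.5
mgf_X = math.exp(t * t / 2)       # MGF of N(0, 1)
mgf_Y = (math.exp(t) - 1) / t     # MGF of Uniform(0, 1)
print(mgf_sum_numeric(t), mgf_X * mgf_Y)
```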

If you have access to a good table of integral transforms, or to Mathematica/Maple, this is also an easy way to get the pdf of your variable. For example, to do the example you asked about with the uniform distribution on the interval from 0 to 1 and a Gaussian factor Exp[-x^2] (an unnormalized normal density with mean 0 and variance 1/2), you'd just do

Code:
InverseFourierTransform[(FourierTransform[
    UnitStep[x] - UnitStep[x - 1], x, t] FourierTransform[Exp[-x^2], 
    x, t]), t, x]

And Mathematica spits out

[tex]\frac{\text{Erf}[1-x]+\text{Erf}[x]}{2 \sqrt{2}}[/tex]

Note that above I used the characteristic function (Fourier transform) rather than the MGF, but it's the same idea.
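One way to sanity-check Mathematica's answer without the transform machinery (a sketch in Python, my addition): compute the convolution ∫ from 0 to 1 of Exp[-(x - y)^2] dy directly. Substituting u = x - y gives (√π/2)(erf(x) + erf(1 - x)); Mathematica's (Erf[1-x] + Erf[x])/(2√2) is this divided by √(2π), the extra factor coming from its default Fourier-transform convention.

```python
import math

def conv_numeric(x, n=20000):
    """Midpoint-rule value of the convolution of Exp[-x^2] with the
    indicator of [0, 1]: integral over [0,1] of exp(-(x - y)^2) dy."""
    dy = 1.0 / n
    return sum(math.exp(-((x - (i + 0.5) * dy) ** 2)) for i in range(n)) * dy

def conv_closed(x):
    """(sqrt(pi)/2) * (erf(x) + erf(1 - x)); dividing by sqrt(2*pi)
    recovers the (Erf[1-x] + Erf[x])/(2 Sqrt[2]) Mathematica reports."""
    return 0.5 * math.sqrt(math.pi) * (math.erf(x) + math.erf(1 - x))

print(conv_numeric(0.5), conv_closed(0.5))
```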
 
