Sum of 2 Non-identical Uniform RVs?

  • Context: Graduate
  • Thread starter: zli034
  • Tags: Sum
SUMMARY

The discussion focuses on deriving the probability density function (pdf) of the sum of two non-identically distributed uniform random variables, specifically X uniformly distributed over (0,1) and Y uniformly distributed over (-9,0). The sum Z = X + Y can be analyzed using characteristic functions: because X and Y are independent, the characteristic function of Z is the product of the individual characteristic functions of X and Y. The pdf of Z is then obtained by applying an inverse Fourier transform to that product; for this example the same pdf can also be found by directly convolving the two uniform densities.

PREREQUISITES
  • Understanding of uniform random variables and their properties
  • Familiarity with characteristic functions in probability theory
  • Knowledge of Fourier transforms and their applications in probability
  • Ability to perform inverse Fourier transforms for continuous distributions
NEXT STEPS
  • Study the derivation of characteristic functions for uniform random variables
  • Learn how to perform inverse Fourier transforms in probability distributions
  • Explore the properties of the sum of independent random variables
  • Investigate the implications of non-overlapping ranges in uniform distributions
USEFUL FOR

Statisticians, data scientists, and mathematicians interested in probability theory, particularly those working with random variables and their distributions.

zli034
Does anyone know how to formulate the pdf of the sum of two non-identically distributed uniform random variables?

Say rv X is uniformly distributed on the range (0,1), and rv Y is uniformly distributed on the range (-9,0).
For Z = X+Y, what is the probability distribution of Z?

Thanks in advance. So many things are just nice to know.
 
For independent random variables [itex]x_i[/itex], the distribution of a sum of these variables,

[tex]z = a_1 x_1 + \dots + a_N x_N[/tex]

can be found using characteristic functions. The characteristic function of a random variable is given by

[tex]\varphi_{X_k}(t) = \langle \exp(ix_k t)\rangle[/tex]
i.e., the expectation value of exp(ix_k t). Note that this is just a Fourier transform for continuous variables and a Fourier series for discrete variables, so the original pdf can be recovered by performing an inverse Fourier transform.
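As a quick sanity check (not part of the original thread), the characteristic function of U(a, b) has the closed form φ(t) = (e^{itb} − e^{ita}) / (it(b − a)), with φ(0) = 1, and it can be compared against a Monte Carlo estimate of the expectation E[exp(itX)]. A minimal pure-Python sketch, with the helper names chosen here for illustration:

```python
import cmath
import random

def uniform_cf(t, a, b):
    """Closed-form characteristic function of U(a, b):
    phi(t) = (e^{itb} - e^{ita}) / (i t (b - a)), with phi(0) = 1."""
    if t == 0:
        return 1.0 + 0.0j
    return (cmath.exp(1j * t * b) - cmath.exp(1j * t * a)) / (1j * t * (b - a))

def empirical_cf(t, samples):
    """Monte Carlo estimate of E[exp(i t X)] from a list of samples."""
    return sum(cmath.exp(1j * t * x) for x in samples) / len(samples)

random.seed(0)
xs = [random.uniform(0.0, 1.0) for _ in range(200_000)]
for t in (0.5, 1.0, 2.0):
    exact = uniform_cf(t, 0.0, 1.0)
    estimate = empirical_cf(t, xs)
    print(f"t={t}: |exact - MC| = {abs(exact - estimate):.4f}")  # small, up to MC error
```

With 200,000 samples the Monte Carlo error in each component is on the order of 1/√n ≈ 0.002, so the two values should agree to two decimal places.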

Given this, the characteristic function of z is

[tex]\varphi_Z(t) = \prod_{k=1}^N \varphi_{X_k}(a_kt)[/tex]

So, to get the pdf of z, inverse Fourier transform the resulting expression.
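The product rule itself can be verified numerically for the specific X ~ U(0,1), Y ~ U(-9,0) case: sample Z = X + Y and compare the empirical characteristic function of Z with the product of the two closed-form characteristic functions. A self-contained sketch (not from the thread; helper name is illustrative):

```python
import cmath
import random

def cf_uniform(t, a, b):
    # Closed-form characteristic function of U(a, b); phi(0) = 1.
    if t == 0:
        return 1.0 + 0.0j
    return (cmath.exp(1j * t * b) - cmath.exp(1j * t * a)) / (1j * t * (b - a))

random.seed(1)
n = 200_000
# Z = X + Y with X ~ U(0, 1) and Y ~ U(-9, 0), drawn independently
zs = [random.uniform(0.0, 1.0) + random.uniform(-9.0, 0.0) for _ in range(n)]

t = 0.7
product = cf_uniform(t, 0.0, 1.0) * cf_uniform(t, -9.0, 0.0)
empirical = sum(cmath.exp(1j * t * z) for z in zs) / n
print(abs(product - empirical))  # small, up to Monte Carlo error
```

Independence of X and Y is what licenses the factorization; for dependent variables the product rule fails.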

Try applying this to your specific case of two different uniform variables. (I'm not sure off the top of my head whether the inverse FT can be performed in closed form; it will involve the product of two sinc functions. Equivalently, you can convolve the two densities directly. Because the two ranges have different widths, the result is a trapezoidal density on (-9, 1): rising linearly on (-9, -8), flat at height 1/9 on (-8, 0), and falling linearly on (0, 1).)
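For this specific pair the shape of the resulting density is easy to check by simulation (a sketch added for illustration, not part of the original thread). Direct convolution of the two densities gives a trapezoid on (-9, 1) with flat height 1/9 on (-8, 0), so the flat region should carry probability mass 8/9 and each linear ramp mass 1/18:

```python
import random

random.seed(2)
n = 400_000
# Z = X + Y with X ~ U(0, 1) and Y ~ U(-9, 0), drawn independently
zs = [random.uniform(0.0, 1.0) + random.uniform(-9.0, 0.0) for _ in range(n)]

# Trapezoidal density on (-9, 1):
#   f(z) = (z + 9)/9 on (-9, -8),  1/9 on (-8, 0),  (1 - z)/9 on (0, 1)
flat = sum(1 for z in zs if -8.0 <= z <= 0.0) / n
left = sum(1 for z in zs if z < -8.0) / n
right = sum(1 for z in zs if z > 0.0) / n
print(flat, left, right)  # ~ 8/9, ~ 1/18, ~ 1/18
assert all(-9.0 < z < 1.0 for z in zs)  # support of Z
```

The three empirical masses match 8/9, 1/18, and 1/18 to Monte Carlo accuracy, consistent with the trapezoidal convolution.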
 
