Probability of Sum of Squares of 2 Uniform RVs < 1

SUMMARY

The probability that the sum of the squares of two independent U(0,1) random variables Y_1 and Y_2 is at most 1 can be computed in two ways. The integration approach finds the joint distribution of the squared variables and integrates it over the appropriate region; the geometric approach notes that the event corresponds to a quarter circle inside the unit square. Both give P(Y_1^2 + Y_2^2 ≤ 1) = π/4. The transformation method for deriving the probability density functions (PDFs) of the squared variables is the essential step in the integration approach.

PREREQUISITES
  • Understanding of uniform random variables, specifically U(0,1)
  • Knowledge of probability density functions (PDFs) and cumulative distribution functions (CDFs)
  • Familiarity with integration techniques in probability
  • Ability to apply transformation theorems in probability theory
NEXT STEPS
  • Study the transformation theorem for random variables in depth
  • Learn about the convolution theorem and its applications in probability
  • Explore the properties of the beta distribution and its relationship to uniform distributions
  • Practice calculating probabilities using double integrals in two-dimensional probability spaces
USEFUL FOR

Mathematicians, statisticians, and students studying probability theory, particularly those interested in the properties of random variables and integration techniques in probability.

thisguy12
If you were to pick two random numbers on the interval [0,1], what is the probability that the sum of their squares is less than 1? That is, if you let Y_1 ~ U(0,1) and Y_2 ~ U(0,1), find P(Y_1^2 + Y^2_2 \leq 1). There is also a hint: the substitution u = 1 - y_1 may be helpful - look for a beta distribution.


Here's what I've done so far:


I know that the density function for Y_1 and Y_2 is the same, f(y_1) = f(y_2) = 1 on the interval [0,1].

P(Y_1^2 + Y_2^2 \leq 1) = P(Y_2^2 \leq 1 - Y_1^2) = P(-\sqrt{1 - Y_1^2} \leq Y_2 \leq \sqrt{1 - Y_1^2}) = \int^{\sqrt{1 - Y_1^2}}_{-\sqrt{1 - Y_1^2}} dy_2 = 2\sqrt{1 - Y_1^2}

And that's where I get stuck. I thought that maybe it was a beta distribution with \alpha = 3/2, \beta = 1, but the beta function \beta(3/2, 1) = 2/3 \neq 1/2.
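For reference, one way to push this line of reasoning through (a sketch filling in the missing step, not part of the original post): since Y_2 \geq 0, the lower limit is 0 rather than -\sqrt{1 - Y_1^2}, and the resulting conditional probability still has to be averaged over the density of Y_1:

P(Y_1^2 + Y_2^2 \leq 1) = \int_0^1 P(Y_2 \leq \sqrt{1 - y_1^2}) f_{Y_1}(y_1) \, dy_1 = \int_0^1 \sqrt{1 - y_1^2} \, dy_1 = \frac{\pi}{4},

where the last integral follows from the substitution y_1 = \sin\theta, which turns it into \int_0^{\pi/2} \cos^2\theta \, d\theta = \pi/4.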
 


Hey thisguy12 and welcome to the forums.

The way that comes to my mind is to find the PDF of the square of the uniform distribution U(0,1) and then use the convolution theorem to get the CDF of the sum of the two distributions, since both are independent.

In terms of getting the PDF of the square of the uniform distribution, you can use a transformation theorem.

Also, your PDF expression in the last line is wrong, since for one thing it can take values greater than 1, which makes it invalid.

Do you know how to calculate the PDF of a transformed random variable? You don't necessarily need convolution per se for the second part, but it's probably a good idea to be able to find the PDF/CDF of your transformed variable to go to the next step.
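As a sketch of the two steps being suggested here (writing U = Y_1^2 and W = Y_2^2, the notation used in the later posts): the PDF of the square can be read off its CDF, and the convolution of the two resulting densities turns out to be constant on (0, 1].

F_U(u) = P(Y_1^2 \leq u) = P(Y_1 \leq \sqrt{u}) = \sqrt{u} for 0 \leq u \leq 1, so f_U(u) = \frac{1}{2\sqrt{u}}, and likewise for W.

f_{U+W}(s) = \int_0^s \frac{1}{2\sqrt{t}} \cdot \frac{1}{2\sqrt{s-t}} \, dt = \frac{1}{4} \int_0^s \frac{dt}{\sqrt{t(s-t)}} = \frac{\pi}{4} for 0 < s \leq 1,

so P(U + W \leq 1) = \int_0^1 \frac{\pi}{4} \, ds = \frac{\pi}{4}.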
 


One easy way. The joint distribution of X and Y is uniform over the unit square.
The condition X^2 + Y^2 ≤ 1 simply puts the variable pair inside a quarter circle of radius 1, so the probability is the area of that quarter circle, π/4.
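A quick Monte Carlo check of that geometric argument (a minimal sketch in Python, not part of the thread; the sample size is arbitrary):

import random

# Estimate P(Y1^2 + Y2^2 <= 1) for Y1, Y2 ~ U(0,1) by sampling the unit square
# and counting how often the point lands inside the quarter circle.
n = 1_000_000
hits = sum(random.random() ** 2 + random.random() ** 2 <= 1 for _ in range(n))
print(hits / n)               # typically around 0.785
print(3.141592653589793 / 4)  # pi/4 = 0.7853981633974483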
 


chiro said:
Do you know how to calculate the PDF of a transformed random variable?

Okay so using the transformation method, I found the density functions for U = Y^2_1 and W = Y^2_2.

U = h(Y_1) = Y^2_1
h^{-1}(U) = Y_1 = \sqrt{U}
\frac{d[h^{-1}(u)]}{du} = \frac{1}{2\sqrt{u}}
f_U(u) = f_{Y_1}(\sqrt{u}) \frac{1}{2\sqrt{u}} = 1(\frac{1}{2\sqrt{u}})=\frac{1}{2\sqrt{u}}

Using the same method, I also get f_W(w) = \frac{1}{2\sqrt{w}}.

So now I am looking for P(U + W \leq 1). The joint density function of two independent random variables is the product of their marginal density functions, f(x_1, x_2) = f_{x_1}(x_1)f_{x_2}(x_2).

Thus, f(u, w) = f_U(u)f_W(w) = (\frac{1}{2\sqrt{u}})(\frac{1}{2\sqrt{w}}) = \frac{1}{4\sqrt{uw}}. So P(U + W \leq 1) = \int^1_0 \int^{1-u}_0 f(u, w) dw du.

Is this method correct? If so, this integral should give me the correct answer, right?
 


thisguy12 said:
Is this method correct? If so, this integral should give me the correct answer, right?

That looks pretty good to me.

It would be nice though to get another opinion on this from an experienced member just to be sure.
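For what it's worth, the double integral above can be checked symbolically (a sketch using sympy, assuming it is available; the variable names just mirror the post):

import sympy as sp

u, w = sp.symbols('u w', positive=True)

# Joint density of U = Y1^2 and W = Y2^2 as derived in the previous post.
f = 1 / (4 * sp.sqrt(u * w))

# Inner integral over w from 0 to 1 - u, then the outer integral over u from 0 to 1.
inner = sp.integrate(f, (w, 0, 1 - u))  # sqrt(1 - u) / (2*sqrt(u))
prob = sp.integrate(inner, (u, 0, 1))   # pi/4
print(inner)
print(prob)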
 


If you get π/4 from the double integral, then it is right, although it looks like a difficult way to get an easy answer.
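For completeness, evaluating that double integral by hand also gives π/4, and this is where the beta distribution from the original hint shows up:

\int_0^1 \int_0^{1-u} \frac{1}{4\sqrt{uw}} \, dw \, du = \int_0^1 \frac{\sqrt{1-u}}{2\sqrt{u}} \, du = \frac{1}{2} \int_0^1 u^{1/2 - 1} (1-u)^{3/2 - 1} \, du = \frac{1}{2} B\left(\tfrac{1}{2}, \tfrac{3}{2}\right) = \frac{1}{2} \cdot \frac{\Gamma(1/2)\,\Gamma(3/2)}{\Gamma(2)} = \frac{\pi}{4},

since \Gamma(1/2) = \sqrt{\pi}, \Gamma(3/2) = \tfrac{1}{2}\sqrt{\pi}, and \Gamma(2) = 1.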
 
