Probability of Sum of Squares of 2 Uniform RVs < 1

AI Thread Summary
The discussion centers on calculating the probability that the sum of the squares of two independent uniform random variables on the interval [0,1] is less than 1. Participants explore various methods, including transformation techniques to derive the probability density functions (PDFs) of the squared variables and the use of convolution for independent distributions. The joint distribution is noted to be uniform over the unit square, leading to the conclusion that the probability corresponds to the area of a quarter circle, which is π/4. There is some debate about the correctness of the integral approach used to derive this probability, with suggestions for verification from experienced members. Ultimately, the integral method is affirmed as a valid approach to confirm the result.
thisguy12
If you were to pick two random numbers on the interval [0,1], what is the probability that the sum of their squares is less than 1? That is, if you let Y_1 ~ U(0,1) and Y_2 ~ U(0,1), find P(Y_1^2 + Y_2^2 \leq 1). There is also a hint: the substitution u = 1 - y_1 may be helpful; look for a beta distribution.


Here's what I've done so far:


I know that the density function for Y_1 and Y_2 is the same, f(y_1) = f (y_2) = 1 on the interval [0,1].

P(Y_1^2 + Y_2^2 \leq 1) = P(Y_2^2 \leq 1 - Y_1^2) = P(-\sqrt{1 - Y_1^2} \leq Y_2 \leq \sqrt{1 - Y_1^2}) = \int^{\sqrt{1 - Y_1^2}}_{-\sqrt{1 - Y_1^2}} dy_2 = 2\sqrt{1 - Y_1^2}

And that's where I get stuck. I thought it might be a beta distribution with \alpha = 3/2, \beta = 1, but the beta function \beta(3/2, 1) = 2/3 \neq 1/2.
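As a quick sanity check (not part of the original post's derivation), the probability can be estimated by simulation: the fraction of uniform pairs with y_1^2 + y_2^2 \leq 1 should come out close to \pi/4 \approx 0.785.

```python
import math
import random

# Monte Carlo estimate of P(Y1^2 + Y2^2 <= 1) for Y1, Y2 ~ U(0,1).
# Each trial draws a point uniformly in the unit square and checks
# whether it falls inside the quarter circle of radius 1.
random.seed(0)
n = 1_000_000
hits = sum(1 for _ in range(n)
           if random.random() ** 2 + random.random() ** 2 <= 1)
print(hits / n)     # should be close to pi/4
print(math.pi / 4)  # 0.7853981...
```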
 


Hey thisguy12 and welcome to the forums.

The way that comes to my mind is to find the PDF of the square of the uniform distribution U(0,1) and then use the convolution theorem to get the CDF for the sum of the two distributions since both are independent.

In terms of getting the PDF of the square of the uniform distribution, you can use a transformation theorem.

Also, your expression in the last line is wrong: for one thing, it can take values greater than 1, which makes it invalid as a probability.

Do you know how to calculate the PDF of a transformed random variable? You don't necessarily need convolution per se for the second part, but it's probably a good idea to be able to find the PDF/CDF of your transformed variable to go to the next step.
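To illustrate the transformation step numerically (a sketch, not code from the thread): if Y ~ U(0,1) and U = Y^2, then F_U(u) = P(Y \leq \sqrt{u}) = \sqrt{u}, so f_U(u) = 1/(2\sqrt{u}). The empirical CDF of simulated squares should track \sqrt{u}.

```python
import random

# If Y ~ U(0,1) and U = Y^2, then F_U(u) = P(Y <= sqrt(u)) = sqrt(u),
# hence f_U(u) = 1 / (2 sqrt(u)) on (0, 1].  Check the CDF empirically.
random.seed(1)
samples = [random.random() ** 2 for _ in range(200_000)]
for u in (0.1, 0.25, 0.5, 0.9):
    empirical = sum(1 for s in samples if s <= u) / len(samples)
    print(f"u={u}: empirical={empirical:.4f}, sqrt(u)={u ** 0.5:.4f}")
```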
 


One easy way. The joint distribution of X and Y is uniform over the unit square.
The condition X^2 + Y^2 \leq 1 simply puts the variable pair inside a quarter circle of radius 1, so the probability is the area, \pi/4.
 


chiro said:
Hey thisguy12 and welcome to the forums.

The way that comes to my mind is to find the PDF of the square of the uniform distribution U(0,1) and then use the convolution theorem to get the CDF for the sum of the two distributions since both are independent.

In terms of getting the PDF of the square of the uniform distribution, you can use a transformation theorem.

Also, your expression in the last line is wrong: for one thing, it can take values greater than 1, which makes it invalid as a probability.

Do you know how to calculate the PDF of a transformed random variable? You don't necessarily need convolution per se for the second part, but it's probably a good idea to be able to find the PDF/CDF of your transformed variable to go to the next step.

Okay so using the transformation method, I found the density functions for U = Y^2_1 and W = Y^2_2.

h(Y) = U = Y^2_1
h^{-1}(U) = Y_1 = \sqrt{U}
\frac{d[h^{-1}(u)]}{du} = \frac{1}{2\sqrt{u}}
f_U(u) = f_{Y_1}(\sqrt{u}) \frac{1}{2\sqrt{u}} = 1(\frac{1}{2\sqrt{u}})=\frac{1}{2\sqrt{u}}

Using the same method, I also get f_W(w) = \frac{1}{2\sqrt{w}}.

So now I am looking for P(U + W \leq 1). The joint density function of two independent random variables is the product of their marginal density functions, f(x_1, x_2) = f_{x_1}(x_1)f_{x_2}(x_2).

Thus, f(u, w) = f_U(u)f_W(w) = (\frac{1}{2\sqrt{u}})(\frac{1}{2\sqrt{w}}) = \frac{1}{4\sqrt{uw}}. So P(U + W \leq 1) = \int^1_0 \int^{1-u}_0 f(u, w) dw du.

Is this method correct? If so, this integral should give me the correct answer, right?
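Evaluating that double integral numerically (a sketch, not from the thread) does recover \pi/4. Doing the inner w-integral in closed form first, \int^{1-u}_0 \frac{dw}{4\sqrt{uw}} = \frac{\sqrt{1-u}}{2\sqrt{u}}, leaves a one-dimensional integral that the midpoint rule handles despite the integrable singularity at u = 0:

```python
import math

# Numerically evaluate P(U + W <= 1) = ∫_0^1 ∫_0^{1-u} dw du / (4 sqrt(u w)).
# The inner integral is sqrt(1 - u) / (2 sqrt(u)); the midpoint rule on the
# outer integral never evaluates the endpoint u = 0, where the (integrable)
# singularity sits.
n = 400_000
h = 1.0 / n
total = sum(math.sqrt(1 - (i + 0.5) * h) / (2 * math.sqrt((i + 0.5) * h)) * h
            for i in range(n))
print(total)  # close to pi/4 ≈ 0.7853981
```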
 


thisguy12 said:
Okay so using the transformation method, I found the density functions for U = Y^2_1 and W = Y^2_2.

h(Y) = U = Y^2_1
h^{-1}(U) = Y_1 = \sqrt{U}
\frac{d[h^{-1}(u)]}{du} = \frac{1}{2\sqrt{u}}
f_U(u) = f_{Y_1}(\sqrt{u}) \frac{1}{2\sqrt{u}} = 1(\frac{1}{2\sqrt{u}})=\frac{1}{2\sqrt{u}}

Using the same method, I also get f_W(w) = \frac{1}{2\sqrt{w}}.

So now I am looking for P(U + W \leq 1). The joint density function of two independent random variables is the product of their marginal density functions, f(x_1, x_2) = f_{x_1}(x_1)f_{x_2}(x_2).

Thus, f(u, w) = f_U(u)f_W(w) = (\frac{1}{2\sqrt{u}})(\frac{1}{2\sqrt{w}}) = \frac{1}{4\sqrt{uw}}. So P(U + W \leq 1) = \int^1_0 \int^{1-u}_0 f(u, w) dw du.

Is this method correct? If so, this integral should give me the correct answer, right?

That looks pretty good to me.

It would be nice though to get another opinion on this from an experienced member just to be sure.
 


If you get π/4 from the double integral, then it is right, although it looks like a difficult way to get an easy answer.
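For completeness (a step not written out in the thread), the double integral also closes in one line via the beta function from the original hint:

\int^1_0 \int^{1-u}_0 \frac{1}{4\sqrt{uw}} \, dw \, du = \int^1_0 \frac{2\sqrt{1-u}}{4\sqrt{u}} \, du = \frac{1}{2} \int^1_0 u^{-1/2}(1-u)^{1/2} \, du = \frac{1}{2} B(\tfrac{1}{2}, \tfrac{3}{2}) = \frac{1}{2} \cdot \frac{\Gamma(1/2)\Gamma(3/2)}{\Gamma(2)} = \frac{1}{2} \cdot \frac{\pi}{2} = \frac{\pi}{4}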
 