Probability of Sum of Squares of 2 Uniform RVs < 1

thisguy12
If you were to pick two random numbers on the interval [0,1], what is the probability that the sum of their squares is less than 1? That is, if you let Y_1 ~ U(0,1) and Y_2 ~ U(0,1), find P(Y_1^2 + Y_2^2 \leq 1). There is also a hint: the substitution u = 1 - y_1 may be helpful - look for a beta distribution.


Here's what I've done so far:


I know that the density function for Y_1 and Y_2 is the same: f(y_1) = f(y_2) = 1 on the interval [0,1].

P(Y_1^2 + Y_2^2 \leq 1) = P(Y_2^2 \leq 1 - Y_1^2) = P(-\sqrt{1 - Y_1^2} \leq Y_2 \leq \sqrt{1 - Y_1^2}) = \int^{\sqrt{1 - Y_1^2}}_{-\sqrt{1 - Y_1^2}} dy_2 = 2\sqrt{1 - Y_1^2}

And that's where I get stuck. I thought that it might be a beta distribution with \alpha = 3/2, \beta = 1, but the beta function \beta(3/2, 1) = 2/3 \neq 1/2.
 


Hey thisguy12 and welcome to the forums.

The way that comes to my mind is to find the PDF of the square of the uniform distribution U(0,1) and then use the convolution theorem to get the CDF for the sum of the two distributions since both are independent.

In terms of getting the PDF of the square of the uniform distribution, you can use a transformation theorem.

Also your PDF expression in the last line is wrong, since for one it can take values greater than 1, which makes it invalid.

Do you know how to calculate the PDF of a transformed random variable? You don't necessarily need convolution per se for the second part, but it's probably a good idea to be able to find the PDF/CDF of your transformed variable to go to the next step.
 


One easy way. The joint distribution of X and Y is uniform over the unit square.
The condition X^2 + Y^2 \leq 1 simply puts the variable pair inside a quarter circle of radius 1, so the probability is the area = \pi/4.
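A quick Monte Carlo sanity check of the geometric argument (a Python sketch of my own, not from the thread): draw points uniformly on the unit square and count how often they land inside the quarter circle.

```python
import random

# Draw (x, y) uniformly on the unit square and count how often
# x^2 + y^2 <= 1, i.e. the point lands in the quarter disc.
random.seed(0)
n = 200_000
hits = sum(1 for _ in range(n)
           if random.random() ** 2 + random.random() ** 2 <= 1)
estimate = hits / n
print(estimate)  # should be close to pi/4 ≈ 0.7854
```

With 200,000 draws the standard error is about 0.001, so the estimate should agree with \pi/4 to two decimal places.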
 


chiro said:
Hey thisguy12 and welcome to the forums.

The way that comes to my mind is to find the PDF of the square of the uniform distribution U(0,1) and then use the convolution theorem to get the CDF for the sum of the two distributions since both are independent.

In terms of getting the PDF of the square of the uniform distribution, you can use a transformation theorem.

Also your PDF expression in the last line is wrong, since for one it can take values greater than 1, which makes it invalid.

Do you know how to calculate the PDF of a transformed random variable? You don't necessarily need convolution per se for the second part, but it's probably a good idea to be able to find the PDF/CDF of your transformed variable to go to the next step.

Okay so using the transformation method, I found the density functions for U = Y^2_1 and W = Y^2_2.

h(Y) = U = Y^2_1
h^{-1}(U) = Y_1 = \sqrt{U}
\frac{d[h^{-1}(u)]}{du} = \frac{1}{2\sqrt{u}}
f_U(u) = f_{Y_1}(\sqrt{u}) \frac{1}{2\sqrt{u}} = 1(\frac{1}{2\sqrt{u}})=\frac{1}{2\sqrt{u}}

Using the same method, I also get f_W(w) = \frac{1}{2\sqrt{w}}.
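As an empirical check of that transformed density (a hypothetical Python sketch, not part of the derivation): f_U(u) = 1/(2\sqrt{u}) integrates to the CDF F_U(u) = \sqrt{u}, so the fraction of squared uniforms at or below u should track \sqrt{u}.

```python
import random

# The CDF of U = Y^2 for Y ~ U(0,1) is F_U(u) = P(Y^2 <= u) = sqrt(u),
# which is what the density f_U(u) = 1/(2*sqrt(u)) integrates to.
# Compare the simulated fraction below u against sqrt(u) at a few points.
random.seed(1)
samples = [random.random() ** 2 for _ in range(100_000)]
checks = {u: sum(s <= u for s in samples) / len(samples)
          for u in (0.1, 0.25, 0.5, 0.9)}
for u, frac in checks.items():
    print(u, frac, u ** 0.5)
```

Each empirical fraction should match \sqrt{u} to about two decimal places at this sample size.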

So now I am looking for P(U + W \leq 1). The joint density function of two independent random variables is the product of their marginal density functions, f(x_1, x_2) = f_{x_1}(x_1)f_{x_2}(x_2).

Thus, f(u, w) = f_U(u)f_W(w) = (\frac{1}{2\sqrt{u}})(\frac{1}{2\sqrt{w}}) = \frac{1}{4\sqrt{uw}}. So P(U + W \leq 1) = \int^1_0 \int^{1-u}_0 f(u, w)\, dw\, du.

Is this method correct? If so, this integral should give me the correct answer, right?
 


thisguy12 said:
Okay so using the transformation method, I found the density functions for U = Y^2_1 and W = Y^2_2.

h(Y) = U = Y^2_1
h^{-1}(U) = Y_1 = \sqrt{U}
\frac{d[h^{-1}(u)]}{du} = \frac{1}{2\sqrt{u}}
f_U(u) = f_{Y_1}(\sqrt{u}) \frac{1}{2\sqrt{u}} = 1(\frac{1}{2\sqrt{u}})=\frac{1}{2\sqrt{u}}

Using the same method, I also get f_W(w) = \frac{1}{2\sqrt{w}}.

So now I am looking for P(U + W \leq 1). The joint density function of two independent random variables is the product of their marginal density functions, f(x_1, x_2) = f_{x_1}(x_1)f_{x_2}(x_2).

Thus, f(u, w) = f_U(u)f_W(w) = (\frac{1}{2\sqrt{u}})(\frac{1}{2\sqrt{w}}) = \frac{1}{4\sqrt{uw}}. So P(U + W \leq 1) = \int^1_0 \int^{1-u}_0 f(u, w)\, dw\, du.

Is this method correct? If so, this integral should give me the correct answer, right?

That looks pretty good to me.

It would be nice though to get another opinion on this from an experienced member just to be sure.
 


If you get π/4 from the double integral, then it is right, although it looks like a difficult way to get an easy answer.
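Evaluating that double integral numerically confirms it (a Python sketch of my own; the closed-form inner integral is just a shortcut, not from the thread). The inner integral has the closed form \int_0^{1-u} \frac{dw}{4\sqrt{uw}} = \frac{\sqrt{1-u}}{2\sqrt{u}}, leaving a one-dimensional integral over u, which a midpoint sum handles (midpoints avoid the integrable singularity at u = 0).

```python
import math

# Midpoint-rule evaluation of
#   ∫_0^1 ∫_0^{1-u} 1/(4*sqrt(u*w)) dw du.
# The inner integral equals sqrt(1-u)/(2*sqrt(u)), so only a 1-D
# midpoint sum over u is needed; sampling at cell midpoints keeps
# clear of the singularities at u = 0 and u = 1.
n = 1_000_000
total = 0.0
for i in range(n):
    u = (i + 0.5) / n
    total += math.sqrt(1 - u) / (2 * math.sqrt(u))
result = total / n
print(result, math.pi / 4)
```

The midpoint sum agrees with \pi/4 to roughly three decimal places at this resolution, which matches the geometric answer above.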
 