
slakedlime


## Homework Statement

A point is uniformly distributed within the disk of radius 1.

That is, its density is f(x, y) = C for 0 ≤ x² + y² ≤ 1.

Find the probability that its distance from the origin is less than x, 0 ≤ x ≤ 1.

[Note] My book says that the answer is supposed to be x².

## The attempt at a solution

Let D be the distance of the point from the origin.

Then D = √(x² + y²), where x and y are the point's x and y components respectively.

My interpretation is that the unit disk is centered at the origin (0, 0).

I can see that D can be GREATER than the point's x-component (e.g. as we move away from the x-axis and towards the y-axis). However, how can D be less than x? Is this x the same as the point's x-component, or is it independent of the point? For example, suppose x is not the same as the point's x-component (let's denote that as ω). Then 0 ≤ x ≤ 1 and 0 ≤ ω ≤ 1, where x may or may not be equal to ω.

If this is the case, do we have to find the probability that √(ω² + y²) < x?

Any explanations or guidance on how to proceed with this problem will be immensely appreciated!
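As a sanity check on the book's x² answer, here is a small Monte Carlo sketch: it samples points uniformly from the unit disk (by rejection sampling from the enclosing square) and estimates P(distance from origin < t). The function name `estimate_cdf` and the use of `t` for the threshold (the "x" in the problem statement, kept distinct from the point's x-coordinate) are my own labels, not from the original problem.

```python
import random

def estimate_cdf(t, n=200_000):
    """Estimate P(distance from origin < t) for a point uniform on the
    unit disk, via rejection sampling from the square [-1, 1] x [-1, 1]."""
    hits = 0
    accepted = 0
    while accepted < n:
        x, y = random.uniform(-1, 1), random.uniform(-1, 1)
        if x * x + y * y <= 1:           # keep only points inside the unit disk
            accepted += 1
            if x * x + y * y < t * t:    # distance < t  <=>  x² + y² < t²
                hits += 1
    return hits / n

for t in (0.25, 0.5, 0.75):
    print(t, estimate_cdf(t))  # each estimate should land close to t**2
```

If the estimates track t², that supports reading the problem's x as a plain threshold on the distance, unrelated to the sampled point's own x-coordinate.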
