I'm taking a probability class where multivariable calculus was not a prerequisite, but some of it is coming up. I get the concept of, say, integrating over a region, but I get lost in some of the mechanics.
Here is the problem (I don't want a full solution):
A point is uniformly distributed within the disk of radius 1. That is, its density is
$$f(x,y) = C, \qquad 0 \leq x^2 + y^2 \leq 1$$
Find the probability that its distance from the origin is less than $x$, $0 \leq x \leq 1$.
I'm pretty sure I have to set up an integral over a disk of radius $x$ to get the probability.
Something like this:
$$\int_A \int_B C \, dx \, dy$$
But I don't know what the intervals $A$ and $B$ are supposed to be.
Can someone point me in the right direction? I get confused because my attempts end up with $x$ appearing in the limits of integration, but $x$ is the dummy variable, which doesn't seem right.
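One renaming that might clear up the clash (my own notation, not from the problem statement): call the fixed distance $a$ and keep $x, y$ as the integration variables, so the probability would look something like
$$P\left(\sqrt{X^2 + Y^2} < a\right) = \iint_{x^2 + y^2 \leq a^2} C \, dx \, dy, \qquad 0 \leq a \leq 1.$$
Then the limits $A$ and $B$ would depend on $a$ rather than colliding with the dummy variables. Does that sound like the right way to set it up?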