# Marginal Density of Coordinates Inside an Ellipse

1. Feb 6, 2013

### gajohnson

1. The problem statement, all variables and given/known data

A point is chosen randomly in the interior of an ellipse:

(x/a)^2 + (y/b)^2 = 1

Find the marginal densities of the X and Y coordinates of the point.

2. Relevant equations

NA

3. The attempt at a solution

So this ought to be uniformly distributed, thus the joint density is $f_{X,Y}(x,y) = 1/(\pi ab)$ (where $\pi ab$ is the area of the ellipse).

So, to find the marginal density for x (and later for y), I realize that I just need to find the limits of integration and then go about my business. I believe that the limits of integration are
$-\frac{b}{a}\sqrt{a^2 - x^2}$ and $\frac{b}{a}\sqrt{a^2 - x^2}$,

since these should be the minimum and maximum values that y can take for any given x. Are these limits of integration correct, and is my reasoning sound?
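(A quick numerical sanity check of the setup above: the sketch below is my own, with assumed semi-axes a = 2, b = 1 purely for illustration. It rejection-samples points uniformly in the ellipse and compares the empirical density of X near a test point against the marginal obtained by integrating the constant joint density over those y-limits.)

```python
import math
import random

random.seed(0)

a, b = 2.0, 1.0  # assumed semi-axes, chosen only for this check

def f_X(x):
    """Marginal of X: integrate f(x, y) = 1/(pi*a*b) over y from
    -(b/a)*sqrt(a^2 - x^2) to +(b/a)*sqrt(a^2 - x^2)."""
    if abs(x) > a:
        return 0.0
    return 2.0 * math.sqrt(a * a - x * x) / (math.pi * a * a)

def sample_x():
    """Draw a uniform point from the ellipse by rejection from the
    bounding box, and return its x-coordinate."""
    while True:
        x = random.uniform(-a, a)
        y = random.uniform(-b, b)
        if (x / a) ** 2 + (y / b) ** 2 < 1.0:
            return x

n = 200_000
xs = [sample_x() for _ in range(n)]

# Empirical density on a small window around x0 vs. the formula.
x0, h = 0.5, 0.05
empirical = sum(1 for x in xs if abs(x - x0) < h) / (n * 2 * h)
print(empirical, f_X(x0))  # the two values should be close
```

The empirical estimate should land within a percent or two of the formula, and a Riemann sum of f_X over [-a, a] should come out to 1, as a density must.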

As always, many thanks to all of you wonderful Homework Helpers!

Last edited: Feb 6, 2013
2. Feb 7, 2013

### clamtrox

That seems entirely correct.
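For completeness (this step is implied but not written out in the thread), carrying out the integration with those limits gives the explicit marginal:

$$f_X(x) = \int_{-\frac{b}{a}\sqrt{a^2 - x^2}}^{\frac{b}{a}\sqrt{a^2 - x^2}} \frac{1}{\pi a b}\, dy = \frac{2}{\pi a b}\cdot\frac{b}{a}\sqrt{a^2 - x^2} = \frac{2\sqrt{a^2 - x^2}}{\pi a^2}, \qquad |x| \le a,$$

and by the symmetric argument $f_Y(y) = \frac{2\sqrt{b^2 - y^2}}{\pi b^2}$ for $|y| \le b$.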

3. Feb 7, 2013

### gajohnson

Excellent, thanks!