# Density, distribution and derivative relationship (stats)

1. May 15, 2017

### Andrea94

I am currently enrolled in a statistics course, and the following is stated in my course book with no attempt at an explanation:

Suppose that f is the probability density function for the random variable (X,Y), and that F is the distribution function. Then,

$$f_{X,Y}(x,y)=\frac{\partial^{2} F_{X,Y}(x,y)}{\partial x \partial x}$$

I have repeatedly tried to find an explanation for why this is so, but all I keep finding is documents from various university statistics courses that just flat out give this result with no attempt at an explanation.

My question is, can you show or explain why the above result is true?

2. May 15, 2017

### BvU

Hi Andrea,

You call it a result, but if you study the accompanying text you may find that it is really just a formula expressing the definition of the probability density (or, read the other way around, the definition of the cumulative distribution function).

Can you find the equivalent in the single-stochastic-variable (univariate, e.g. $x$) case? Do you have difficulty with that one?

3. May 15, 2017

### Andrea94

Hi and thanks for the reply,

In the single-variable case, I think of it in terms of the fundamental theorem of calculus as follows:

$$F(x) = \int_{-\infty}^{x} f(t)\,dt, \qquad F'(x) = f(x)$$

Even though the definition for the distribution function looks like an obvious extension,

$$F_{X,Y}(x,y)= \int_{-\infty}^{x}\int_{-\infty}^{y} f(u,v)\,dv\,du$$

I still don't see it as an immediately obvious result from the definition. Is there anything you could provide in this regard that might help?
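In the meantime, here is a quick numerical sanity check I ran for the univariate relation $F'(x) = f(x)$. The standard normal is my own arbitrary concrete choice here, since its CDF is available through the error function; any density with a known closed-form CDF would do.

```python
# Numerical sanity check of the univariate relation F'(x) = f(x),
# using the standard normal distribution as a concrete example.
import math

def F(x):
    """Standard normal CDF, written via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def f(x):
    """Standard normal PDF."""
    return math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)

h = 1e-5
for x in (-1.3, 0.0, 0.7, 2.1):
    # Central difference approximation of F'(x).
    derivative = (F(x + h) - F(x - h)) / (2.0 * h)
    assert abs(derivative - f(x)) < 1e-8, (x, derivative, f(x))
print("F'(x) matches f(x) at all test points")
```

The central difference has error $O(h^2)$, so the agreement to eight decimal places is exactly what the theorem predicts.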

4. May 15, 2017

### Andrea94

Is this correct?

$$\frac{\partial}{\partial y}\frac{\partial}{\partial x}\int_{-\infty}^{x}\int_{-\infty}^{y} f(u,v)\,dv\,du = \frac{\partial}{\partial y}\int_{-\infty}^{y} f(x,v)\,dv = f(x,y)$$

If so, I think I've got it.
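As an extra sanity check, here is a small numerical experiment of my own construction. I take independent Exp(1) marginals, so that both $F$ and $f$ have closed forms, and verify that a mixed finite difference of $F$ reproduces $f$:

```python
# Numerical check of f(x, y) = d^2 F / (dx dy) for independent Exp(1)
# marginals: F(x, y) = (1 - e^-x)(1 - e^-y), f(x, y) = e^-(x + y), x, y > 0.
import math

def F(x, y):
    """Joint CDF of two independent Exp(1) random variables."""
    return (1.0 - math.exp(-x)) * (1.0 - math.exp(-y))

def f(x, y):
    """Joint PDF of two independent Exp(1) random variables."""
    return math.exp(-(x + y))

h = 1e-4
for (x, y) in [(0.5, 0.5), (1.0, 2.0), (2.5, 0.3)]:
    # Second-order mixed central difference approximating d^2F/(dx dy).
    mixed = (F(x + h, y + h) - F(x + h, y - h)
             - F(x - h, y + h) + F(x - h, y - h)) / (4.0 * h * h)
    assert abs(mixed - f(x, y)) < 1e-6, (x, y, mixed, f(x, y))
print("mixed partial of F matches f at all test points")
```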

5. May 15, 2017

### BvU

The formula as you quoted it from the book, with $\partial^2/\partial x\,\partial x$, is easily shown to be wrong: if $f$ depended only on $y$, differentiating twice with respect to $x$ would give back zero, not $f$ itself. The intended relation is the mixed partial $\partial^2 F/\partial x\,\partial y$, which is what your computation in post #4 establishes.

Basically, the probability of an interval is the integral of the probability density over that interval. If the interval is infinitesimally small, that integral becomes the probability density times the interval length (or the area, if it's in two variables). That's really all there is to it.
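In symbols: the probability of a small rectangle is a second difference of $F$, and shrinking the rectangle turns that second difference into the mixed partial:

$$P(x < X \le x+\Delta x,\; y < Y \le y+\Delta y) = F(x+\Delta x, y+\Delta y) - F(x+\Delta x, y) - F(x, y+\Delta y) + F(x, y) \approx f(x,y)\,\Delta x\,\Delta y$$

so dividing by the area $\Delta x\,\Delta y$ and letting it shrink to zero gives

$$f(x,y) = \lim_{\Delta x,\,\Delta y \to 0}\frac{F(x+\Delta x, y+\Delta y) - F(x+\Delta x, y) - F(x, y+\Delta y) + F(x, y)}{\Delta x\,\Delta y} = \frac{\partial^{2} F_{X,Y}(x,y)}{\partial x\,\partial y}.$$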

https://en.wikipedia.org/wiki/Probability_density_function in particular:
https://en.wikipedia.org/wiki/Proba...bsolutely_continuous_univariate_distributions and
https://en.wikipedia.org/wiki/Cumulative_distribution_function

but I guess you looked there already?

6. May 15, 2017

### Andrea94

Yes I have checked those links before.

I understand how the probability (for a continuous random variable) is an integral of the probability density over an interval (or an area, in two variables). What I cannot do is make the connection between the density function and the distribution function in terms of the mixed partial derivative. I suspect this has more to do with my calculus knowledge than with my probability and statistics knowledge.

7. May 15, 2017

### Stephen Tashi

If a joint density $f(x,y)$ depended only on $x$, then what constant value could a number like $f(7,y)$ take on the interval $(-\infty, y]$ besides zero?

8. May 15, 2017

### Zafa Pi

I think what @Stephen Tashi is saying in post #7 is: there is no density on the plane that depends on only one argument/variable in rectangular coordinates. In polar coordinates you can sometimes get away with one argument.

Last edited: May 16, 2017
9. May 16, 2017

### Zafa Pi

If $f$ is continuous, it is correct, and it should be in any calculus text that treats several variables; e.g. Courant, Vol. II, page 239.
There is no single natural way to define a distribution function for $f$ on the plane, the way there (almost) is on the line.
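To spell out the role of the continuity hypothesis: when $f$ is continuous, differentiating the iterated integral in either order recovers $f$, so the two mixed partials of $F$ agree (Schwarz's theorem):

$$\frac{\partial^{2} F}{\partial x\,\partial y} = \frac{\partial^{2} F}{\partial y\,\partial x} = f(x,y).$$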

Last edited: May 16, 2017
10. May 16, 2017

### Andrea94

Fantastic, thank you so much for the reference! I found the book on Google and looked at page 239. Interestingly, I never saw this anywhere in my multivariable calculus course.