Density, distribution and derivative relationship (stats)

Andrea94
I am currently enrolled in a statistics course, and the following is stated in my course book with no attempt at an explanation:

Suppose that f is the probability density function for the random variable (X,Y), and that F is the distribution function. Then,

f_{X,Y}(x,y)=\frac{\partial^{2} F_{X,Y}(x,y)}{\partial x\,\partial y}

I have repeatedly tried to find an explanation for why this is so, but all I keep finding are documents from various university statistics courses that just state this result without explanation.

My question is, can you show or explain why the above result is true?
 
Hi Andrea,

You call it a result, but perhaps if you study the accompanying text you might find that it's in fact just a formula expressing the definition of probability density (or, read the other way around, the definition of the cumulative distribution function).

Can you find the equivalent in the single stochastic variable (univariate, e.g. ##x##) case? Do you have difficulty with that one?
 
Hi and thanks for the reply,

In the single-variable case, I think of it in terms of the fundamental theorem of calculus as follows:

F(x) = \int_{-\infty}^{x} f(t)\,dt, \qquad F'(x) = f(x)

Even though the definition of the distribution function looks like an obvious extension,

F_{X,Y}(x,y)= \int_{-\infty}^{x}\int_{-\infty}^{y} f(u,v)\,dv\,du

I still don't see it as an immediately obvious result from the definition. Is there anything you could provide in this regard that might help?
 
Is this correct?

\frac{\partial}{\partial y}\frac{\partial}{\partial x}\int_{-\infty}^{x}\int_{-\infty}^{y} f(u,v)\,dv\,du = \frac{\partial}{\partial y}\int_{-\infty}^{y} f(x,v)\,dv = f(x,y)

If so, I think I got it
 
Andrea94 said:
Is this correct?

\frac{\partial}{\partial y}\frac{\partial}{\partial x}\int_{-\infty}^{x}\int_{-\infty}^{y} f(u,v)\,dv\,du = \frac{\partial}{\partial y}\int_{-\infty}^{y} f(x,v)\,dv = f(x,y)

If so, I think I got it
This is easily shown to be wrong: if ##f## only depends on one of the two arguments, you get back zero, so not ##f## itself...
Basically a probability for an interval is an integral of probability density over that interval. If the interval size is infinitesimally small then that becomes the probability density times the interval size. (or area if it's in two variables). That's really all there is to it.
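
Written out for a small rectangle ##(x, x+\Delta x] \times (y, y+\Delta y]## (just a sketch of the idea, not a formal proof), that probability can be expressed through ##F## as

P(x < X \le x+\Delta x,\; y < Y \le y+\Delta y) = F(x+\Delta x, y+\Delta y) - F(x+\Delta x, y) - F(x, y+\Delta y) + F(x, y) \approx f(x,y)\,\Delta x\,\Delta y

and dividing by ##\Delta x\,\Delta y## and letting both shrink to zero is exactly taking the mixed partial derivative of ##F##, which is where the formula in your book comes from.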

https://en.wikipedia.org/wiki/Probability_density_function in particular:
https://en.wikipedia.org/wiki/Proba...bsolutely_continuous_univariate_distributions and
https://en.wikipedia.org/wiki/Cumulative_distribution_function

but I guess you looked there already ?
 
BvU said:
This is easily shown to be wrong: if ##f## only depends on one of the two arguments, you get back zero, so not ##f## itself...
Basically a probability for an interval is an integral of probability density over that interval. If the interval size is infinitesimally small then that becomes the probability density times the interval size. (or area if it's in two variables). That's really all there is to it.

https://en.wikipedia.org/wiki/Probability_density_function in particular:
https://en.wikipedia.org/wiki/Proba...bsolutely_continuous_univariate_distributions and
https://en.wikipedia.org/wiki/Cumulative_distribution_function

but I guess you looked there already ?

Yes, I have checked those links before.

I understand how the probability (in the case of a continuous random variable) is an integral of probability density over an interval (or area if it's in 2 variables), I just cannot make the connection between the density function and the distribution function in terms of the mixed partial derivative. I suspect this has more to do with my calculus knowledge than my probability and statistics knowledge.
 
BvU said:
This is easily shown to be wrong: if ##f## only depends on one of the two arguments, you get back zero, so not ##f## itself...

If a joint density ##f(x,y)## depends only on ##x##, then what constant value can a number like ##f(7,y)## take on an interval ##(-\infty, y]## besides zero?
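
Put differently, a small supporting computation: if ##f(x,y) = g(x)## for every ##y##, then for each fixed ##x##

\int_{-\infty}^{\infty} f(x,y)\,dy = \int_{-\infty}^{\infty} g(x)\,dy

which is infinite unless ##g(x) = 0##. A genuine joint density must integrate to one, so in rectangular coordinates it cannot depend on only one of its arguments.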
 
BvU said:
This is easily shown to be wrong: if ##f## only depends on one of the two arguments, you get back zero, so not ##f## itself...
I think what @Stephen Tashi is saying in post #7 is: there is no density on the plane that depends on only one argument/variable in rectangular coordinates. In polar coordinates you can sometimes get away with one argument.
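
For instance, the standard bivariate normal density ##f(x,y) = \frac{1}{2\pi} e^{-(x^2+y^2)/2}## depends on ##(x,y)## only through the radius ##r = \sqrt{x^2+y^2}##, so in polar coordinates it is a function of ##r## alone, but in rectangular coordinates it still involves both ##x## and ##y##.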
 
Andrea94 said:
Is this correct?

\frac{\partial}{\partial y}\frac{\partial}{\partial x}\int_{-\infty}^{x}\int_{-\infty}^{y} f(u,v)\,dv\,du = \frac{\partial}{\partial y}\int_{-\infty}^{y} f(x,v)\,dv = f(x,y)

If so, I think I got it
If ##f## is continuous it is correct and should be in any calculus text treating several variables, e.g. Courant Vol. II, page 239.
There is no one natural way to define a distribution function for ##f## on the plane, as there almost is on the line.
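
Spelled out, each step is one application of the fundamental theorem of calculus (a sketch, assuming ##f## is continuous and the improper inner integral is well enough behaved to be a continuous function of the outer variable):

\frac{\partial}{\partial x}\int_{-\infty}^{x}\left[\int_{-\infty}^{y} f(u,v)\,dv\right]du = \int_{-\infty}^{y} f(x,v)\,dv, \qquad \frac{\partial}{\partial y}\int_{-\infty}^{y} f(x,v)\,dv = f(x,y)

Continuity is what lets the fundamental theorem be applied at both stages.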
 
Zafa Pi said:
If ##f## is continuous it is correct and should be in any calculus text treating several variables, e.g. Courant Vol. II, page 239.
There is no one natural way to define a distribution function for ##f## on the plane, as there almost is on the line.

Fantastic, thank you so much for the reference! I found the book on Google and looked at page 239. Interestingly, I never saw this anywhere in my multivariable calculus course.
 