Multivariate Distributions, Moments, and Correlations


by shaiguy6
Tags: correlations, distributions, moments, multivariate
shaiguy6
#1
Mar10-11, 12:29 PM
So if I start with a multivariate distribution f(x,y), I can find the marginal distributions, the conditional probability distributions, all conditional moments, and by the law of iterated expectations, the moments of both X and Y.

It seems to me that I should be able to relate the conditional moments in x to the conditional moments in y. Right? This is mainly coming from intuition. To be a bit clearer: if I have a function V(x,t) that has all the properties of a joint probability distribution, I can begin to describe its shape by finding the conditional moments in X and in T. But it seems like the full set of conditional moments in X should be able to recreate the original function V(x,t) just as well as the full set of conditional moments in T. I was wondering if this makes any sense at all. I'm not all that familiar with statistics, and feel like a huge dilettante.


Relatedly, if I want to define a covariance or correlation between my two random variables, X and T, but I only know their joint distribution V(x,t), is the way to go about it to compute the first two moments of X and T using the law of iterated expectations, and then find the covariance and correlation from those?
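
To make that concrete, here is the kind of rough numerical check I'm picturing (the correlated Gaussian and the grid are just a made-up example, so the right answer is known):

[code]
# Rough sketch: covariance and correlation computed directly from a joint
# density V(x,t) discretized on a grid. The bivariate Gaussian is a made-up
# test case whose true correlation is rho.
import numpy as np

x = np.linspace(-6, 6, 401)
t = np.linspace(-6, 6, 401)
X, T = np.meshgrid(x, t, indexing="ij")
dx, dt = x[1] - x[0], t[1] - t[0]
rho = 0.5
V = np.exp(-(X**2 - 2*rho*X*T + T**2) / (2*(1 - rho**2))) / (2*np.pi*np.sqrt(1 - rho**2))

EX  = np.sum(X * V) * dx * dt          # E[X]
ET  = np.sum(T * V) * dx * dt          # E[T]
EXT = np.sum(X * T * V) * dx * dt      # E[XT]

cov  = EXT - EX * ET
varX = np.sum((X - EX)**2 * V) * dx * dt
varT = np.sum((T - ET)**2 * V) * dx * dt
corr = cov / np.sqrt(varX * varT)
print(cov, corr)                       # should come out close to rho = 0.5
[/code]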


Sorry, one final thing. Having a probability distribution p(x) is equivalent (at least when the moments determine the distribution) to having the infinite collection of moments of that distribution. My question is: how can you rebuild the probability distribution given all the moments?
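
What I have in mind by "rebuilding" is roughly the characteristic-function route (assuming the moment series converges and the moments actually determine the distribution, which I gather is not automatic):

[tex]\varphi(k)=E\left[e^{ikX}\right]=\sum_{n=0}^{\infty}\frac{(ik)^n}{n!}\,E[X^n], \qquad p(x)=\frac{1}{2\pi}\int_{-\infty}^{\infty}\varphi(k)\,e^{-ikx}\,dk[/tex]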

Sorry for my ramblings :)
Any help is appreciated.
SW VandeCarr
#2
Mar12-11, 03:35 AM
Quote by shaiguy6:
But it seems like the full set of conditional moments in X should be able to recreate the original function V(x,t) just as well as the full set of conditional moments in T. I was wondering if this makes any sense at all. I'm not all that familiar with statistics, and feel like a huge dilettante.


Relatedly, if I want to define a covariance or correlation between my two random variables, X and T, but I only know their joint distribution V(x,t), is the way to go about it to compute the first two moments of X and T using the law of iterated expectations, and then find the covariance and correlation from those?


Sorry, one final thing. Having a probability distribution p(x) is equivalent (at least when the moments determine the distribution) to having the infinite collection of moments of that distribution. My question is: how can you rebuild the probability distribution given all the moments?

I'm not exactly sure what you're after, but if you know f(x,y) and f(x) you can find the conditional expectation of Y from:

[tex] E(Y|X=x)=\int y \frac{f(x,y)}{f(x)} dy[/tex]

In general, for a random variable X conditioned on an event B, E(X|B) is the sum (or integral) over all possible values of the random variable of each value times its conditional probability (or density) given B.

EDIT: Note that the nth central moment is [tex] E[(X-\mu_X)^n][/tex] for n > 1, where [tex]\mu_X = E(X)[/tex], and the covariance is [tex]E[(X-\mu_X)(Y-\mu_Y)][/tex].
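
If it helps, that integral is easy to check numerically; here is a rough sketch with a made-up bivariate Gaussian (so the answer is known in closed form):

[code]
# Rough sketch: E(Y|X=x) = \int y f(x,y)/f(x) dy evaluated numerically
# at one fixed x, for a toy standard bivariate normal with correlation rho.
import numpy as np

y = np.linspace(-8, 8, 2001)
dy = y[1] - y[0]
rho = 0.5
x0 = 1.2                                    # the value we condition on

# joint density f(x0, y)
f_xy = np.exp(-(x0**2 - 2*rho*x0*y + y**2) / (2*(1 - rho**2))) / (2*np.pi*np.sqrt(1 - rho**2))

f_x = np.sum(f_xy) * dy                     # marginal f(x0) = \int f(x0,y) dy
E_Y_given_x = np.sum(y * f_xy / f_x) * dy   # conditional expectation

print(E_Y_given_x)                          # analytically rho*x0 = 0.6 here
[/code]
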
shaiguy6
#3
Mar12-11, 12:15 PM
Quote by SW VandeCarr:
I'm not exactly sure what you're after, but if you know f(x,y) and f(x) you can find the conditional expectation of Y from:

[tex] E(Y|X=x)=\int y \frac{f(x,y)}{f(x)} dy[/tex]

In general, for a random variable X conditioned on an event B, E(X|B) is the sum (or integral) over all possible values of the random variable of each value times its conditional probability (or density) given B.

EDIT: Note that the nth central moment is [tex] E[(X-\mu_X)^n][/tex] for n > 1, where [tex]\mu_X = E(X)[/tex], and the covariance is [tex]E[(X-\mu_X)(Y-\mu_Y)][/tex].

Yeah, thanks. I've been messing around a bit, and I think I've almost got it figured out (I just need to make everything pretty for my specific example). In general, this is what I have:

Given the moments of either conditional distribution, you can recreate its characteristic function. Then, to get the conditional distribution back, you take the inverse Fourier transform of the characteristic function. Since the two conditional distributions are easily related to each other, we can then get the conditional distribution of the other variable, and from there its moments. That is how the two sets of conditional moments are related. Going from either of the conditional distributions back to the joint distribution is straightforward enough. If we keep everything in terms of the original moments, we end up with a relationship between the conditional moments and the joint moments (though in general it doesn't look pretty).
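
As a rough numerical check of the characteristic-function step, here is a toy case (a standard normal, whose moments I can write down, so I can compare with the true density; the truncations are crude and it falls apart if pushed too far):

[code]
# Rough sketch: rebuild a density from its moments by summing the
# characteristic function as a (truncated) power series in the moments,
# then inverting the Fourier transform numerically.
import numpy as np
from math import factorial

N = 60                                # number of moments used
m = np.zeros(N + 1)
m[0] = 1.0
for n in range(2, N + 1):             # standard-normal moments: m_n = (n-1)*m_{n-2}
    m[n] = (n - 1) * m[n - 2]

k = np.linspace(-4, 4, 801)           # the truncated series only behaves for modest |k|
phi = np.zeros_like(k, dtype=complex)
for n in range(N + 1):
    phi += (1j * k)**n * m[n] / float(factorial(n))   # phi(k) ~ sum_n (ik)^n m_n / n!

x = np.linspace(-4, 4, 401)
dk = k[1] - k[0]
# p(x) = (1/2pi) \int phi(k) e^{-ikx} dk, done as a plain Riemann sum
p = np.array([np.sum(phi * np.exp(-1j * k * xi)).real for xi in x]) * dk / (2 * np.pi)

exact = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)
print(np.max(np.abs(p - exact)))      # small for this toy case
[/code]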

SW VandeCarr
#4
Mar12-11, 01:49 PM



Quote by shaiguy6:
Since the two conditional distributions are easily related to each other, we can then get the conditional distribution of the other variable, and from there its moments. That is how the two sets of conditional moments are related. Going from either of the conditional distributions back to the joint distribution is straightforward enough. If we keep everything in terms of the original moments, we end up with a relationship between the conditional moments and the joint moments (though in general it doesn't look pretty).
If I understand you correctly, if you know f(x) and f(y), you can get f(x,y) using the characteristic function. In general, from any two of the above functions you can obtain the third. Moreover, knowing the functions allows you to obtain the moments. However, it sounds like you're trying to get the functions from knowing only the moments. I'm not sure this is true. Obtaining the expectations (the first moments) E(X|Y=y) or E(Y|X=x) analytically requires knowing at least two of the three functions.
shaiguy6
#5
Mar12-11, 02:02 PM
Quote by SW VandeCarr:
If I understand you correctly, if you know f(x) and f(y), you can get f(x,y) using the characteristic function. In general, from any two of the above functions you can obtain the third. Moreover, knowing the functions allows you to obtain the moments. However, it sounds like you're trying to get the functions from knowing only the moments. I'm not sure this is true. Obtaining the expectations (the first moments) E(X|Y=y) or E(Y|X=x) analytically requires knowing at least two of the three functions.
I actually start with only the joint distribution, f(x,y). Then I am trying to find a relationship between the moments of the conditional distribution f(x|y) and the moments of the conditional distribution f(y|x). I'm running into a slight issue taking the inverse Fourier transform of e^k... Other than that, I think the algorithm above works fine. Wait, finding the expectation of the conditionals only requires knowing f(x,y), I thought.

Because, for instance, if I have f(x,y) given, I can get the conditional by taking [tex]f(y|x)=\frac{f(x,y)}{\int_{-\infty}^{\infty} f(x,y)\,dy}[/tex]. Then we can find its moments in the normal fashion.


In other words, the marginal distributions f(y) and f(x) can be directly found from f(x,y), unless I am misinterpreting something.

So we have: [tex]
E(Y|X=x)=\int_{-\infty}^{\infty} y\, \frac{f(x,y)}{\int_{-\infty}^{\infty} f(x,y)\,dy}\, dy
[/tex]
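
Numerically, the same recipe gives the conditional moments in both directions from a single joint density; a rough sketch, again with a made-up correlated Gaussian standing in for f(x,y):

[code]
# Rough sketch: conditional expectations E(Y|X=x) and E(X|Y=y) computed on a
# grid directly from a toy joint density f(x,y) (correlated standard normals).
import numpy as np

x = np.linspace(-6, 6, 401)
y = np.linspace(-6, 6, 401)
X, Y = np.meshgrid(x, y, indexing="ij")
dx, dy = x[1] - x[0], y[1] - y[0]
rho = 0.5
f = np.exp(-(X**2 - 2*rho*X*Y + Y**2) / (2*(1 - rho**2))) / (2*np.pi*np.sqrt(1 - rho**2))

f_x = np.sum(f, axis=1) * dy                     # marginal f(x) = \int f(x,y) dy
f_y = np.sum(f, axis=0) * dx                     # marginal f(y) = \int f(x,y) dx

E_Y_given_x = np.sum(Y * f, axis=1) * dy / f_x   # E(Y|X=x) for every grid x
E_X_given_y = np.sum(X * f, axis=0) * dx / f_y   # E(X|Y=y) for every grid y

# for this joint, both curves should track rho times the conditioning value
# (up to small discretization and edge effects near the ends of the grid)
print(np.max(np.abs(E_Y_given_x - rho * x)), np.max(np.abs(E_X_given_y - rho * y)))
[/code]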

