Multivariate Distributions, Moments, and Correlations

SUMMARY

This discussion focuses on the relationships between a joint distribution f(x,y), its marginal distributions, and its conditional distributions. The law of iterated expectations is used to obtain the moments, covariance, and correlation of the random variables X and T from the joint distribution. Key points include reconstructing a conditional distribution from its moments via the characteristic function, and the question of which of the joint, marginal, and conditional distributions are needed to compute conditional expectations. The conversation also emphasizes the role of characteristic functions in relating moments and distributions.

PREREQUISITES
  • Understanding of multivariate distributions and joint probability functions.
  • Familiarity with the law of iterated expectations and conditional probability.
  • Knowledge of characteristic functions and their role in probability theory.
  • Basic concepts of moments in statistics, including nth moments and covariance.
NEXT STEPS
  • Study the derivation of conditional distributions from joint distributions using f(x,y).
  • Learn about the inverse Fourier transform and its application in probability theory.
  • Explore the relationship between moments and characteristic functions in multivariate statistics.
  • Investigate methods for calculating covariance and correlation from joint distributions.
USEFUL FOR

Statisticians, data analysts, and researchers working with multivariate distributions, as well as students seeking to deepen their understanding of conditional probability and statistical moments.

shaiguy6
So if I start with a multivariate distribution f(x,y), I can find the marginal distributions, the conditional probability distributions, all conditional moments, and by the law of iterated expectations, the moments of both X and Y.

It seems to me that I should be able to relate the conditional moments in x to the conditional moments in y. Right? This is mainly coming from intuition. To be a bit more clear: if I have a function V(x,t) and it has all the properties of a joint probability distribution, I can begin to describe its shape by finding the conditional moments in X and in T. But it seems like all of the conditional moments in X should be able to recreate the original function V(x,t) just as well as all the conditional moments in T. I was wondering if this makes any sense at all. I'm not all too familiar with statistics, and feel like a huge dilettante. :blushing:

Relatedly, if I want to define a covariance or correlation between my two random variables, X and T, but I only know their joint distribution V(x,t), is the way to go about it to compute the first two moments of X and T using the law of iterated expectations, and then find the covariance and correlation that way?

Sorry, one final thing. Having a probability distribution p(x) is equivalent to having the infinite collection of moments of that distribution. My question is: how can you rebuild the probability distribution given all the moments?

Sorry for my ramblings :)
Any help is appreciated.
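
Spelled out, the covariance route asked about above amounts to the following (writing \sigma_X and \sigma_T for the standard deviations of X and T):

Cov(X,T)=E[XT]-E[X]E[T], \quad E[XT]=E\left[E[XT|T]\right]=E\left[T\,E[X|T]\right], \quad E[X]=E\left[E[X|T]\right]

\rho_{X,T}=\frac{Cov(X,T)}{\sigma_X \sigma_T}, \quad \sigma_X^2=E\left[E[X^2|T]\right]-E[X]^2

So the first two conditional moments of X given T, together with the marginal distribution of T, are enough to get both the covariance and the correlation.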
 
shaiguy6 said:
But it seems like all of the conditional moments in X should be able to recreate the original function V(x,t) just as well as all the conditional moments in T. I was wondering if this makes any sense at all. I'm not all too familiar with statistics, and feel like a huge dilettante. :blushing:

Relatedly, if I want to define a covariance or correlation between my two random variables, X and T, but I only know their joint distribution V(x,t), is the way to go about it to compute the first two moments of X and T using the law of iterated expectations, and then find the covariance and correlation that way?

Sorry, one final thing. Having a probability distribution p(x) is equivalent to having the infinite collection of moments of that distribution. My question is: how can you rebuild the probability distribution given all the moments?
I'm not exactly sure what you're after, but if you know f(x,y) and f(x) you can find the conditional expectation of Y from:

E(Y|X=x)=\int y \frac{f(x,y)}{f(x)} dy

In general, for a random variable X conditioned on an event B, E(X|B) is the sum or integral of the possible values of the random variable (or of a function of it), each weighted by its conditional probability given B.

EDIT: Note the nth central moment is E[(X-\mu_X)^n] for n \geq 2, where \mu_X = E(X), and the covariance is E[(X-\mu_X)(Y-\mu_Y)].
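
As a concrete numerical sketch of these formulas (an illustration only; the bivariate normal joint density with correlation 0.6 and the grid-based integration are an assumed setup, not anything from the thread), the conditional expectation, covariance, and correlation can all be approximated directly from a tabulated f(x,y):

```python
# Approximate E(Y|X=x), Cov(X,Y) and the correlation from a joint density
# f(x,y) tabulated on a grid.  The density is a bivariate normal with
# correlation 0.6, chosen only so the answers can be checked by eye.
import numpy as np

x = np.linspace(-6, 6, 601)
y = np.linspace(-6, 6, 601)
dx = x[1] - x[0]
dy = y[1] - y[0]
X, Y = np.meshgrid(x, y, indexing="ij")       # X varies along axis 0, Y along axis 1

rho = 0.6
f_xy = np.exp(-(X**2 - 2*rho*X*Y + Y**2) / (2*(1 - rho**2))) \
       / (2*np.pi*np.sqrt(1 - rho**2))        # joint density f(x,y)

f_x = f_xy.sum(axis=1) * dy                   # marginal f(x) = ∫ f(x,y) dy
E_Y_given_x = (Y * f_xy).sum(axis=1) * dy / f_x   # E(Y|X=x) = ∫ y f(x,y)/f(x) dy

EX  = (X * f_xy).sum() * dx * dy              # E(X)
EY  = (Y * f_xy).sum() * dx * dy              # E(Y)
EXY = (X * Y * f_xy).sum() * dx * dy          # E(XY)
varX = (X**2 * f_xy).sum() * dx * dy - EX**2
varY = (Y**2 * f_xy).sum() * dx * dy - EY**2

cov  = EXY - EX * EY                          # Cov(X,Y) = E(XY) - E(X)E(Y)
corr = cov / np.sqrt(varX * varY)             # should come out close to rho = 0.6

print(corr)                                   # ~0.6
print(E_Y_given_x[len(x)//2])                 # E(Y|X=0) ~ 0 for this density
```

The same grid recipe applies to any joint density; only the line defining f_xy changes.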
 
SW VandeCarr said:
I'm not exactly sure what you're after, but if you know f(x,y) and f(x) you can find the conditional expectation of Y from:

E(Y|X=x)=\int y \frac{f(x,y)}{f(x)} dy

In general, for a random variable X conditioned on an event B, E(X|B) is the sum or integral of the possible values of the random variable (or of a function of it), each weighted by its conditional probability given B.

EDIT: Note the nth central moment is E[(X-\mu_X)^n] for n \geq 2, where \mu_X = E(X), and the covariance is E[(X-\mu_X)(Y-\mu_Y)].


Yeah, thanks. So I've been messing around a bit, and I think I've almost got it figured out (just need to make everything pretty for my specific example). But in general, this is what I have:

Given the moments of either conditional distribution, you can recreate its characteristic function. Then, to get back the conditional distribution, you take the inverse Fourier transform of the characteristic function. Since we can easily relate the two conditional distributions to each other, we can then get the conditional distribution of the other variable, and from there its moments. This is how the moments are related to each other. Going from either of the conditional distributions back to the joint distribution is straightforward enough. If we keep everything in terms of the original moments, we will derive a relationship between the conditional moments and the joint moments (though in general it doesn't look pretty).
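
In symbols, the chain described here is (assuming the conditional moments grow slowly enough that the series below converges and actually determines the characteristic function; this is not automatic for every distribution):

\varphi_{X|Y=y}(t)=E\left[e^{itX}|Y=y\right]=\sum_{n=0}^{\infty}\frac{(it)^n}{n!}E\left[X^n|Y=y\right]

f(x|y)=\frac{1}{2\pi}\int_{-\infty}^{\infty}e^{-itx}\,\varphi_{X|Y=y}(t)\,dt

One caveat: a moment sequence determines its distribution only when the moment problem is determinate (a sufficient condition is that the moment generating function exists in a neighbourhood of zero), so the reconstruction step does not work in complete generality.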
 
shaiguy6 said:
Since we can easily relate the two conditional distributions to each other, we can then get the conditional distribution of the other variable, and from there its moments. This is how the moments are related to each other. Going from either of the conditional distributions back to the joint distribution is straightforward enough. If we keep everything in terms of the original moments, we will derive a relationship between the conditional moments and the joint moments (though in general it doesn't look pretty).

If I understand you correctly, if you know f(x) and f(y), you can get f(x,y) using the characteristic function. In general, for any two of the above functions you can obtain the third. Moreover, knowing the functions allows you to obtain the moments. However, it sounds like you're trying to get the functions from knowing only the moments. I'm not sure this is true. Obtaining the expectations (the first moments) E(X|Y=y) or E(Y|X=x) analytically requires you to know at least two of the three functions.
 
SW VandeCarr said:
If I understand you correctly, if you know f(x) and f(y), you can get f(x,y) using the characteristic function. In general, for any two of the above functions you can obtain the third. Moreover, knowing the functions allows you to obtain the moments. However, it sounds like you're trying to get the functions from knowing only the moments. I'm not sure this is true. Obtaining the expectations (the first moments) E(X|Y=y) or E(Y|X=x) analytically requires you to know at least two of the three functions.

I actually start with only the joint distribution, f(x,y). Then I am trying to find a relationship relating the moments of the conditional probability f(x|y) to the moments of the conditional probability f(y|x). I'm running into a slight issue taking the inverse Fourier transform of e^k... Other than that I think the algorithm above works perfectly. Wait, I thought finding the expectation of the conditionals only requires knowing f(x,y).

Because, for instance, if I have f(x,y) given, I can get the conditional by taking f(x|y)=\frac{f(x,y)}{\int_{-\infty}^{\infty} f(x,y)\,dx}. Then we can find the moments of that in the normal fashion. In other words, the marginal distributions f(y) and f(x) can be directly found from f(x,y), unless I am misinterpreting something.

So we have: E(Y|X=x)=\int_{-\infty}^{\infty} y \frac{f(x,y)}{\int_{-\infty}^{\infty} f(x,y)\,dy}\, dy
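
As a small symbolic check of these formulas (a toy example chosen purely for illustration: the assumed joint density f(x,y)=x+y on the unit square), sympy recovers the marginal, the conditional, and the conditional expectation in closed form:

```python
# Symbolic illustration with an assumed toy density f(x,y) = x + y on [0,1]x[0,1].
import sympy as sp

x, y = sp.symbols('x y', nonnegative=True)
f_xy = x + y                                    # joint density (integrates to 1 over the unit square)

f_x = sp.integrate(f_xy, (y, 0, 1))             # marginal f(x) = ∫ f(x,y) dy = x + 1/2
f_y_given_x = f_xy / f_x                        # conditional f(y|x) = f(x,y) / f(x)
E_Y_given_x = sp.simplify(
    sp.integrate(y * f_y_given_x, (y, 0, 1)))   # E(Y|X=x) = (3x + 2) / (6x + 3)

print(f_x)          # x + 1/2
print(E_Y_given_x)  # (3*x + 2)/(6*x + 3), up to simplification
```

Swapping the roles of x and y in the same few lines gives f(x|y) and E(X|Y=y), which is exactly the symmetry the original question is about.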
 
