Multivariate Distributions, Moments, and Correlations

Discussion Overview

The discussion revolves around the relationships between multivariate distributions, conditional moments, and the reconstruction of probability distributions from moments. Participants explore joint distributions, marginal distributions, and conditional expectations, focusing on whether the conditional moments determine the joint distribution and on how a distribution can be recovered from its full set of moments.

Discussion Character

  • Exploratory
  • Technical explanation
  • Mathematical reasoning
  • Debate/contested

Main Points Raised

  • One participant suggests that conditional moments in X should be able to recreate the original function V(x,t) similarly to conditional moments in T, expressing uncertainty about this intuition.
  • Another participant proposes that to define covariance or correlation between random variables x and t using their joint distribution V(x,t), one can compute the first two moments of X and T via the law of iterated expectations.
  • A participant mentions that having a probability distribution p(x) is equivalent to having all moments of that distribution, questioning how to rebuild the probability distribution from these moments.
  • One participant indicates that given the moments of conditional distribution functions, one can recreate the characteristic function and subsequently the conditional distribution function through the inverse Fourier transform.
  • Another participant asserts that knowing f(x) and f(y) allows for the derivation of f(x,y) using the characteristic function, but expresses skepticism about deriving functions solely from moments.
  • A participant clarifies that starting with the joint distribution f(x,y) enables the calculation of conditional distributions and their moments, challenging the notion that expectations of conditionals require knowledge of more than the joint distribution.

Areas of Agreement / Disagreement

Participants express differing views on the relationships between conditional moments and joint distributions, with some uncertainty about the necessity of knowing multiple functions to derive others. The discussion remains unresolved regarding the ability to reconstruct probability distributions solely from moments.

Contextual Notes

Participants highlight potential issues with taking inverse Fourier transforms and the conditions under which expectations can be derived, indicating a reliance on specific properties of the distributions involved.

shaiguy6
So if I start with a multivariate distribution f(x,y), I can find the marginal distributions, the conditional probability distributions, all conditional moments, and by the law of iterated expectations, the moments of both X and Y.

It seems to me that I should be able to relate the conditional moments in x to the conditional moments in y. Right? This is mainly coming from intuition. To be a bit more clear: if I have a function V(x,t) that has all the properties of a joint probability distribution, I can begin to describe its shape by finding the conditional moments in X and in T. But it seems like all of the conditional moments in X should be able to recreate the original function V(x,t) just as well as all the conditional moments in T. I was wondering if this makes any sense at all. I'm not all too familiar with statistics, and feel like a huge dilettante.

Relatedly, if I want to define a covariance or correlation between my two random variables, x and t, but I only know their joint distribution, V(x,t), is the way to go about it to compute the first two moments of X and T using the law of iterated expectations, and then find the covariance and correlation that way?

Sorry, one final thing. Having a probability distribution p(x) is equivalent to having the infinity of moments of that distribution. My question is: how can you rebuild the probability distribution given all the moments?

Sorry for my ramblings :)
Any help is appreciated.
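To make the covariance question concrete, here is a rough numerical sketch of the kind of computation I have in mind. The joint density V(x,t) = x + t on the unit square is a made-up example (not my actual V), and the scipy-based integration is just one way to do it:

```python
from scipy.integrate import dblquad

# Toy joint density on the unit square (assumed only for illustration):
# V(x, t) = x + t, which integrates to 1 over [0, 1] x [0, 1].
def V(x, t):
    return x + t

def expect(g):
    # E[g(X, T)] = double integral of g(x, t) * V(x, t).
    # dblquad passes the inner integration variable (t here) first.
    val, _ = dblquad(lambda t, x: g(x, t) * V(x, t), 0, 1, lambda x: 0, lambda x: 1)
    return val

EX, ET = expect(lambda x, t: x), expect(lambda x, t: t)
var_x = expect(lambda x, t: x * x) - EX ** 2
var_t = expect(lambda x, t: t * t) - ET ** 2
cov = expect(lambda x, t: x * t) - EX * ET
corr = cov / (var_x * var_t) ** 0.5
print(cov, corr)  # for this density: -1/144 and -1/11
```

For this toy density the exact answers are cov = -1/144 and corr = -1/11, so the numerical output is easy to check.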
 
shaiguy6 said:
But it seems like all of the conditional moments in X should be able to recreate the original function V(x,t) just as well as all the conditional moments in T. [...] Having a probability distribution p(x) is equivalent to having the infinity of moments of that distribution. My question is: how can you rebuild the probability distribution given all the moments?
I'm not exactly sure what you're after, but if you know f(x,y) and f(x) you can find the conditional expectation of Y from:

$$E(Y \mid X = x) = \int y \, \frac{f(x,y)}{f(x)} \, dy$$

In general, for a random variable X conditional on an event B, E(X|B) is the sum or integral of the products of the possible values of the random variable (or of a function of it) and their respective conditional probabilities.

EDIT: Note that for n > 1 the nth central moment is $E[(X - \mu_X)^n]$ and the covariance is $E[(X - \mu_X)(Y - \mu_Y)]$, where $\mu_X = E(X)$ and $\mu_Y = E(Y)$.
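For a concrete check, here is a small numerical sketch of that integral. The joint density f(x,y) = x + y on the unit square is just an assumed example with a known closed form:

```python
from scipy.integrate import quad

# Assumed toy joint density on the unit square: f(x, y) = x + y.
def f_joint(x, y):
    return x + y

def f_x(x):
    # Marginal f(x) = integral of f(x, y) over y.
    val, _ = quad(lambda y: f_joint(x, y), 0, 1)
    return val

def cond_exp_Y(x):
    # E(Y | X = x) = integral of y * f(x, y) / f(x) over y.
    num, _ = quad(lambda y: y * f_joint(x, y), 0, 1)
    return num / f_x(x)

x = 0.3
print(cond_exp_Y(x))                   # numerical value
print((x / 2 + 1 / 3) / (x + 1 / 2))   # closed form for this particular density
```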
 
SW VandeCarr said:
I'm not exactly sure what you're after, but if you know f(x,y) and f(x) you can find the conditional expectation of Y from:

$$E(Y \mid X = x) = \int y \, \frac{f(x,y)}{f(x)} \, dy$$


Yeah, thanks. So I've been messing around a bit, and I think I've almost got it figured out (I just need to make everything pretty for my specific example). In general, this is what I have:

Given the moments of either conditional distribution function, you can recreate the characteristic function. Then, to get back the conditional distribution function, you take the inverse Fourier transform of the characteristic function. Since we can easily relate each of the conditional distribution functions to each other, we can then get the conditional distribution function of the other variable, and then we can get its moments from there. This is how the moments are related to each other. Going from either of the conditional distributions back to the joint distribution is straightforward enough. If we keep everything in terms of the original moments, we will derive a relationship between the conditional moments and the joint moments (though in general it doesn't look pretty).
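Here is a rough sanity check of that recipe in code. I'm assuming the moments of a standard normal purely as a test case, and the truncation order N and the integration window T are ad hoc choices that happen to work for this well-behaved distribution:

```python
import numpy as np
from math import factorial

# Test case (assumed): moments of the standard normal, which satisfy
# m_0 = 1, all odd moments 0, and m_n = (n - 1) * m_{n-2} for even n.
N = 40  # ad hoc truncation order
m = np.zeros(N + 1)
m[0] = 1.0
for n in range(2, N + 1, 2):
    m[n] = (n - 1) * m[n - 2]

def phi(t):
    # Truncated characteristic function: sum over n of (i t)^n m_n / n!
    t = np.asarray(t, dtype=complex)
    return sum((1j * t) ** n * m[n] / factorial(n) for n in range(N + 1))

def trapezoid(y, x):
    # Simple trapezoidal rule, to avoid depending on a particular numpy version.
    return np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2

# Inverse Fourier transform: f(x) = (1/2pi) * integral of e^{-itx} phi(t) dt,
# with |t| <= T chosen small enough that the truncated series is still accurate.
T = 3.5
t = np.linspace(-T, T, 2001)
xs = np.linspace(-3, 3, 61)
pdf = np.array([trapezoid(np.exp(-1j * t * x) * phi(t), t).real for x in xs]) / (2 * np.pi)

true_pdf = np.exp(-xs ** 2 / 2) / np.sqrt(2 * np.pi)
print(np.max(np.abs(pdf - true_pdf)))  # should be small for this test case
```

The reconstruction error should be small here (roughly 1e-3 for these settings), but for a distribution that is not determined by its moments, or with a less forgiving truncation, this kind of inversion can fail badly.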
 
shaiguy6 said:
Since we can easily relate each of the conditional distribution functions to each other, we can then get the conditional distribution function of the other variable, and then we can get its moments from there. [...] If we keep everything in terms of the original moments, we will derive a relationship between the conditional moments and the joint moments (though in general it doesn't look pretty).

If I understand you correctly, if you know f(x) and f(y), you can get f(x,y) using the characteristic function. In general, for any two of the above functions you can obtain the third. Moreover, knowing the functions allows you to obtain the moments. However, it sounds like you're trying to get the functions from knowing only the moments. I'm not sure this is true. Obtaining the expectations (the first moments) E(X|Y=y) or E(Y|X=x) analytically requires you know at least two of the three functions.
 
SW VandeCarr said:
[...] it sounds like you're trying to get the functions from knowing only the moments. I'm not sure this is true. Obtaining the expectations (the first moments) E(X|Y=y) or E(Y|X=x) analytically requires you know at least two of the three functions.

I actually start with only the joint distribution, f(x,y). Then I am trying to find a relationship between the moments of the conditional probability f(x|y) and the moments of the conditional probability f(y|x). I'm running into a slight issue taking the inverse Fourier transform of e^k... Other than that, I think the algorithm above works perfectly. Wait, I thought finding the expectation of the conditionals only requires knowing f(x,y).

Because, for instance, if I have f(x,y) given, I can get the conditional by taking

$$f(x \mid y) = \frac{f(x,y)}{\int_{-\infty}^{\infty} f(x,y)\,dx}$$

Then we can find the moments of that in the normal fashion. In other words, the marginal distributions f(y) and f(x) can be found directly from f(x,y), unless I am misinterpreting something.

So we have:

$$E[Y \mid X = x] = \int_{-\infty}^{\infty} y \, \frac{f(x,y)}{\int_{-\infty}^{\infty} f(x,y)\,dy} \, dy$$
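As a sketch of that last point, with the same kind of assumed toy density as before (not my actual V), both families of conditional moments come straight from f(x,y) alone:

```python
from scipy.integrate import quad

# Assumed toy joint density: f(x, y) = x + y on the unit square.
def f_joint(x, y):
    return x + y

def moment_Y_given_x(x, n):
    # E[Y^n | X = x]: the denominator is the marginal f(x) = integral f(x, y) dy.
    num, _ = quad(lambda y: y ** n * f_joint(x, y), 0, 1)
    den, _ = quad(lambda y: f_joint(x, y), 0, 1)
    return num / den

def moment_X_given_y(y, n):
    # E[X^n | Y = y]: the same construction with the roles of x and y swapped.
    num, _ = quad(lambda x: x ** n * f_joint(x, y), 0, 1)
    den, _ = quad(lambda x: f_joint(x, y), 0, 1)
    return num / den

# Both families of conditional moments come straight from f(x, y):
print(moment_Y_given_x(0.3, 1), moment_X_given_y(0.3, 1))
```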
 
