Multivariate Distributions, Moments, and Correlations

In summary: if you have a joint density V(x,t) and you know all of its moments, you can (under suitable conditions) reconstruct V(x,t) by building the characteristic function from the moments and taking its inverse Fourier transform.
  • #1
shaiguy6
So if I start with a multivariate distribution f(x,y), I can find the marginal distributions, the conditional probability distributions, all conditional moments, and by the law of iterated expectations, the moments of both X and Y.

It seems to me that I should be able to relate the conditional moments in x to the conditional moments in y. Right? This is mainly coming from intuition. To be more clear: if I have a function V(x,t) with all the properties of a joint probability distribution, I can begin to describe its shape by finding the conditional moments in X and in T. But it seems like the full set of conditional moments in X should be able to recreate the original function V(x,t) just as well as the full set of conditional moments in T. I was wondering if this makes any sense at all. I'm not too familiar with statistics, and feel like a huge dilettante.

Relatedly, if I want to define a covariance or correlation between my two random variables, x and t, but I only know their joint distribution V(x,t), is the way to go about it to compute the first two moments of X and T using the law of iterated expectations, and then find the covariance and correlation from those?

Sorry, one final thing. Having a probability distribution p(x) is (essentially) equivalent to having the infinite sequence of moments of that distribution. My question is: how can you rebuild the probability distribution given all the moments?
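To make that last question concrete, here is the standard relation as I understand it (valid when the moment series converges, e.g. when the moment generating function exists in a neighborhood of zero):

[tex] \varphi(k) = E\left[e^{ikX}\right] = \sum_{n=0}^{\infty} \frac{(ik)^n}{n!}\, E[X^n], \qquad p(x) = \frac{1}{2\pi}\int_{-\infty}^{\infty} e^{-ikx}\, \varphi(k)\, dk [/tex]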

Sorry for my ramblings :)
Any help is appreciated.
 
  • #2
shaiguy6 said:
But it seems like the full set of conditional moments in X should be able to recreate the original function V(x,t) just as well as the full set of conditional moments in T. ... Having a probability distribution p(x) is (essentially) equivalent to having the infinite sequence of moments of that distribution. My question is: how can you rebuild the probability distribution given all the moments?
I'm not exactly sure what you're after, but if you know f(x,y) and f(x) you can find the conditional expectation of Y from:

[tex] E(Y|X=x)=\int y \frac{f(x,y)}{f(x)} dy[/tex]

In general, for a random variable X conditional on an event B, E(X|B) is the sum (or integral) over all possible values of X of each value times its conditional probability (or density) given B.
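As a quick numerical sanity check, here is a minimal sketch of that integral, assuming (purely as a hypothetical example) that f(x,y) is a standard bivariate normal with correlation rho, so the answer is known to be E(Y|X=x) = rho*x:

[code]
import numpy as np

# Hypothetical joint density: standard bivariate normal, correlation rho.
rho = 0.6
def f_xy(x, y):
    q = (x**2 - 2*rho*x*y + y**2) / (1 - rho**2)
    return np.exp(-q / 2) / (2 * np.pi * np.sqrt(1 - rho**2))

x0 = 1.0                       # condition on X = x0
y = np.linspace(-8, 8, 4001)   # integration grid for y

fy = f_xy(x0, y)               # slice of the joint density at X = x0
fx = np.trapz(fy, y)           # marginal f(x0) = integral of f(x0, y) dy

cond_mean = np.trapz(y * fy, y) / fx   # E(Y | X = x0)
print(cond_mean, rho * x0)             # the two should agree
[/code]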

EDIT: Note the nth central moment is [tex] E\left[(X-\mu_X)^n\right][/tex] with [tex]\mu_X=E(X)[/tex], n>1, and the covariance is [tex]E\left[(X-\mu_X)(Y-\mu_Y)\right][/tex].
 
  • #3
SW VandeCarr said:
I'm not exactly sure what you're after, but if you know f(x,y) and f(x) you can find the conditional expectation of Y from:

[tex] E(Y|X=x)=\int y \frac{f(x,y)}{f(x)} dy[/tex]

In general, for a random variable X conditional on an event B, E(X|B) is the sum (or integral) over all possible values of X of each value times its conditional probability (or density) given B.

EDIT: Note the nth central moment is [tex] E\left[(X-\mu_X)^n\right][/tex] with [tex]\mu_X=E(X)[/tex], n>1, and the covariance is [tex]E\left[(X-\mu_X)(Y-\mu_Y)\right][/tex].


Yeah, thanks. I've been messing around a bit and I think I've almost got it figured out (I just need to make everything tidy for my specific example). In general, this is what I have:

Given all the moments of either conditional distribution, you can rebuild its characteristic function. Then, to get back the conditional distribution itself, you take the inverse Fourier transform of the characteristic function. Since the two conditional distributions are easily related to each other (through the joint and the marginals), we can then get the conditional distribution of the other variable, and from there its moments. This is how the two sets of conditional moments are related. Going from either conditional distribution back to the joint distribution is straightforward enough. If we keep everything in terms of the original moments, we obtain a relationship between the conditional moments and the joint moments (though in general it doesn't look pretty). A numerical sketch of the moments-to-density round trip is below.
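Here is a minimal numerical sketch of that round trip. It assumes, purely as a test case, a standard normal, whose moments are known in closed form (the even moments are the double factorials, the odd moments vanish); the density is rebuilt from the moments alone and compared with the exact one:

[code]
import math
import numpy as np

# Moments of the standard normal: E[X^(2j)] = (2j-1)!!, odd moments = 0.
N = 100
moments = np.zeros(N + 1)
moments[0] = 1.0
for n in range(2, N + 1, 2):
    moments[n] = moments[n - 2] * (n - 1)

# Characteristic function from the truncated moment series:
#   phi(k) ~ sum_n (ik)^n E[X^n] / n!
# The truncation limits how large |k| can be before the series misbehaves.
k = np.linspace(-5, 5, 2001)
phi = np.zeros_like(k, dtype=complex)
for n in range(0, N + 1, 2):            # odd terms vanish here
    phi += (1j * k) ** n * (moments[n] / math.factorial(n))

# Inverse Fourier transform back to the density:
#   f(x) = (1/2pi) * integral exp(-ikx) phi(k) dk
x = np.linspace(-4, 4, 161)
f = np.array([np.trapz(np.exp(-1j * k * xi) * phi, k).real for xi in x])
f /= 2 * np.pi

exact = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)
print(np.max(np.abs(f - exact)))        # small if N and the k-range are adequate
[/code]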
 
  • #4
shaiguy6 said:
Since the two conditional distributions are easily related to each other (through the joint and the marginals), we can then get the conditional distribution of the other variable, and from there its moments. This is how the two sets of conditional moments are related. Going from either conditional distribution back to the joint distribution is straightforward enough. If we keep everything in terms of the original moments, we obtain a relationship between the conditional moments and the joint moments (though in general it doesn't look pretty).

If I understand you correctly, you're hoping to get f(x,y) from f(x) and f(y) via the characteristic function. Note that from f(x,y) you can always obtain both marginals, but going the other way, from the marginals alone to the joint, only works in special cases (e.g. independence). Moreover, knowing the functions allows you to obtain the moments. However, it sounds like you're trying to get the functions from knowing only the moments. I'm not sure this is true in general. Obtaining the expectations (the first moments) E(X|Y=y) or E(Y|X=x) analytically requires knowing at least two of the three functions.
 
  • #5
SW VandeCarr said:
If I understand you correctly, you're hoping to get f(x,y) from f(x) and f(y) via the characteristic function. Note that from f(x,y) you can always obtain both marginals, but going the other way, from the marginals alone to the joint, only works in special cases (e.g. independence). Moreover, knowing the functions allows you to obtain the moments. However, it sounds like you're trying to get the functions from knowing only the moments. I'm not sure this is true in general. Obtaining the expectations (the first moments) E(X|Y=y) or E(Y|X=x) analytically requires knowing at least two of the three functions.

I actually start with only the joint distribution, f(x,y). Then I am trying to find a relationship between the moments of the conditional density f(x|y) and the moments of the conditional density f(y|x). I'm running into a slight issue taking the inverse Fourier transform of e^k... Other than that, I think the algorithm above works. Also, I thought finding the expectation of the conditionals only requires knowing f(x,y).

For instance, if I have f(x,y) given, I can get the conditional density of Y given X by [tex]f(y|x)=\frac{f(x,y)}{\int_{-\infty}^{\infty} f(x,y)\,dy}[/tex] (the denominator is just the marginal f(x)). Then we can find moments of that in the normal fashion. In other words, the marginal distributions f(x) and f(y) can be found directly from f(x,y), unless I am misinterpreting something.

So we have: [tex]
E(Y|X=x)=\int_{-\infty}^{\infty} y\,\frac{f(x,y)}{\int_{-\infty}^{\infty} f(x,y)\,dy}\,dy
[/tex]
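And for the covariance/correlation question from my first post, the same grid idea works with the plain joint moments, Cov(X,Y) = E[XY] - E[X]E[Y]. A sketch, again with the bivariate normal as a hypothetical stand-in for the joint density:

[code]
import numpy as np

# Hypothetical joint density: standard bivariate normal, correlation rho.
rho = 0.6
x = np.linspace(-8, 8, 801)
y = np.linspace(-8, 8, 801)
X, Y = np.meshgrid(x, y, indexing="ij")
Q = (X**2 - 2*rho*X*Y + Y**2) / (1 - rho**2)
f = np.exp(-Q / 2) / (2 * np.pi * np.sqrt(1 - rho**2))

def I(g):
    # double trapezoid rule: integrate over y (axis 1), then over x
    return np.trapz(np.trapz(g, y, axis=1), x)

EX, EY, EXY = I(X * f), I(Y * f), I(X * Y * f)
VX, VY = I(X**2 * f) - EX**2, I(Y**2 * f) - EY**2

cov = EXY - EX * EY
corr = cov / np.sqrt(VX * VY)
print(cov, corr)   # both should come out near rho = 0.6
[/code]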
 

1. What is a multivariate distribution?

A multivariate distribution is a probability distribution that involves more than one random variable. It describes the joint behavior of multiple variables and their relationship to each other.

2. What are moments in multivariate distributions?

Moments of a multivariate distribution are numerical summaries of its properties: the first moments (means) describe location, the second moments (variances and covariances) describe spread and linear dependence, and higher moments describe shape features such as skewness and kurtosis.
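For two variables, the basic moments are:

[tex] \mu_X = E[X], \qquad \sigma_X^2 = E\left[(X-\mu_X)^2\right], \qquad \operatorname{Cov}(X,Y) = E\left[(X-\mu_X)(Y-\mu_Y)\right] [/tex]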

3. How are correlations calculated in multivariate distributions?

Correlations between pairs of variables in a multivariate distribution are calculated with statistical measures such as Pearson's correlation coefficient, which quantifies the degree of linear relationship, or Spearman's rank correlation coefficient, which quantifies the degree of monotonic relationship.
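For example, a quick sketch on synthetic data (assuming NumPy and SciPy are available):

[code]
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
x = rng.normal(size=1000)
y = 0.7 * x + rng.normal(scale=0.5, size=1000)   # linearly related to x

pearson = np.corrcoef(x, y)[0, 1]     # linear correlation
spearman = stats.spearmanr(x, y)[0]   # rank (monotonic) correlation
print(pearson, spearman)
[/code]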

4. Can multivariate distributions be used to analyze relationships between more than two variables?

Yes, multivariate distributions can be used to analyze relationships between more than two variables. They can provide insights into the complex interactions between multiple variables and help identify important factors that influence a particular outcome.

5. What are some common applications of multivariate distributions in scientific research?

Multivariate distributions are commonly used in fields such as economics, psychology, and biology to analyze data and understand relationships between multiple variables. They are also used in machine learning and data mining to identify patterns and make predictions.
