Product of correlated random variables

SUMMARY

The discussion centers on the computation of the joint probability of correlated random variables, specifically P(X1, X2, ..., Xn). It establishes that even if the individual probabilities P(X1), P(X2), ..., P(Xn) and their covariances can be calculated, these do not provide enough information to derive the joint probability P(X1=x1, X2=x2, ..., Xn=xn). The correlation coefficients yield n² values, while the complete joint distribution requires mⁿ values (where m is the number of values each variable can take), indicating a significant gap in degrees of freedom between the two.

PREREQUISITES
  • Understanding of probability theory, specifically joint and marginal probabilities.
  • Familiarity with covariance and correlation coefficients in statistics.
  • Knowledge of discrete random variables and their distributions.
  • Basic comprehension of combinatorial mathematics related to probabilities.
NEXT STEPS
  • Study the concept of joint probability distributions in detail.
  • Learn about the implications of covariance and correlation in multivariate statistics.
  • Explore the differences between marginal and conditional probabilities.
  • Investigate the use of graphical models to represent joint distributions of correlated variables.
USEFUL FOR

Statisticians, data scientists, and researchers working with correlated random variables, as well as students studying advanced probability theory.

benjaminmar8
Hi, All,

Let X1, X2, ..., Xn be correlated random events (or variables). Say P(X1), P(X2), ..., P(Xn) can be computed; in addition, the covariance and correlation between all the X's can be computed. My question is: what is P(X1) * P(X2) * ... * P(Xn)?
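A quick illustration of why this question matters (my own sketch, not from the thread): for correlated variables, the product of marginals P(X1=x1) * P(X2=x2) is generally not the joint probability P(X1=x1, X2=x2). The joint table below is a made-up example of two correlated bits.

```python
# Hypothetical joint distribution of two correlated bits (X, Y).
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

# Marginals, computed by summing out the other variable.
p_x1 = sum(p for (x, y), p in joint.items() if x == 1)  # P(X=1) = 0.5
p_y1 = sum(p for (x, y), p in joint.items() if y == 1)  # P(Y=1) = 0.5

# Product of marginals vs. actual joint probability:
print(p_x1 * p_y1)    # 0.25
print(joint[(1, 1)])  # 0.4  -- they agree only under independence
```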
 
Well, if P(X1=x1), ..., P(Xn=xn) can be computed, then obviously P(X1=x1) * ... * P(Xn=xn) can also be computed as the product of those. Perhaps you meant to ask, "what is P(X1=x1, X2=x2, ..., Xn=xn)?" From just the correlation coefficients, you don't have enough information to compute that.

I could give you a specific example of a situation where P(X1=x1, X2=x2) can't be computed from your given information, but it is more intuitive to count degrees of freedom. The correlation coefficients give you n^2 numbers, and knowing every P(X_i=x_j) gives you mn numbers, where m is the number of discrete values each variable may take. But knowing every P(X1=x_j1, ..., Xn=x_jn) for each sequence of indices j1, ..., jn involves knowing m^n numbers, potentially a much larger number than n^2 + mn. So the full joint distribution usually has a lot more "degrees of freedom" than the correlation matrix plus the individual distributions, and specifying the latter can't tell you everything about the former. That's not exactly a proof, but it should help give an intuitive idea.
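A concrete version of the specific example mentioned above (my own construction, not from the thread): three binary variables where Z = X XOR Y have exactly the same marginals and pairwise covariances as three independent fair bits, yet the two joint distributions differ. This shows that marginals plus second-order statistics cannot pin down the joint.

```python
from itertools import product

# Distribution A: three independent fair bits.
dist_a = {xyz: 1 / 8 for xyz in product([0, 1], repeat=3)}

# Distribution B: X, Y independent fair bits, Z = X XOR Y (deterministic).
dist_b = {}
for x, y in product([0, 1], repeat=2):
    for z in (0, 1):
        dist_b[(x, y, z)] = 1 / 4 if z == x ^ y else 0.0

def marginal(dist, i):
    """P(X_i = 1) under the given joint distribution."""
    return sum(p for xyz, p in dist.items() if xyz[i] == 1)

def covariance(dist, i, j):
    """Cov(X_i, X_j) = E[X_i X_j] - E[X_i] E[X_j]."""
    e_ij = sum(p * xyz[i] * xyz[j] for xyz, p in dist.items())
    return e_ij - marginal(dist, i) * marginal(dist, j)

# Same marginals and same pairwise covariances...
for i in range(3):
    assert marginal(dist_a, i) == marginal(dist_b, i) == 0.5
for i in range(3):
    for j in range(i + 1, 3):
        assert covariance(dist_a, i, j) == covariance(dist_b, i, j) == 0.0

# ...but different joint probabilities:
print(dist_a[(0, 0, 0)])  # 0.125
print(dist_b[(0, 0, 0)])  # 0.25
```

Here n = 3 and m = 2, so the marginals and covariances supply at most n^2 + mn = 15 numbers, while the joint table has m^n = 8 entries with 7 free parameters; the XOR construction exploits exactly the freedom that the second-order statistics leave unconstrained.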
 
