IID and Dependent RVs: A Closer Look at their Relationship and Parameters

  • Context: Undergrad 
  • Thread starter: member 428835
SUMMARY

The discussion focuses on the relationship between independent and dependent random variables (RVs) using the example of two dependent RVs, X1 and X2, defined as X1=μ1+σ1ε1 and X2=μ2+ρε1+σ2ε2, where ε1 and ε2 are independent and identically distributed (iid) standard normal N(0,1) random variables. Here μ1 and μ2 are the means, σ1 and σ2 scale the noise terms, and ρ is the coefficient on the shared term ε1. The key conclusion is that X1 and X2 are correlated because of their shared dependence on ε1, not because one is a function of the other.

PREREQUISITES
  • Understanding of random variables and their distributions, specifically normal distributions.
  • Familiarity with the concepts of mean (μ) and standard deviation (σ) in statistics.
  • Knowledge of correlation and independence in probability theory.
  • Basic understanding of time series analysis and Moving Average models.
NEXT STEPS
  • Study the properties of dependent and independent random variables in probability theory.
  • Learn about the implications of correlation in statistical analysis.
  • Explore Moving Average models in time series analysis for practical applications.
  • Investigate the role of covariance in understanding relationships between random variables.
USEFUL FOR

Statisticians, data scientists, and anyone interested in understanding the relationships between random variables and their implications in statistical modeling.

member 428835
If ##\epsilon_1,\epsilon_2## are iid ##N(0,1)##, ##X_1=\mu_1+\sigma_1 \epsilon_1## and ##X_2=\mu_2+\rho\epsilon_1+\sigma_2 \epsilon_2## are evidently a pair of dependent RVs that are not identically distributed for most values of the parameters. I have no idea what ##\mu,\sigma,\rho## are. I assume ##\mu## is mean and ##\sigma## is standard deviation? I read this example here.
 
joshmccraney said:
If ##\epsilon_1,\epsilon_2## are iid ##N(0,1)##, ##X_1=\mu_1+\sigma_1 \epsilon_1## and ##X_2=\mu_2+\rho\epsilon_1+\sigma_2 \epsilon_2## are evidently a pair of dependent RVs that are not identically distributed for most values of the parameters. I have no idea what ##\mu,\sigma,\rho## are. I assume ##\mu## is mean and ##\sigma## is standard deviation? I read this example here.
The example in the link does not say exactly what they are, but we can make the reasonable assumption that he is using the very common notation, where all of the ##\mu##s, ##\rho##s, and ##\sigma##s are real-number constants EDIT: with the ##\rho##s and ##\sigma##s positive. In that case, ##\mu_1## is the mean and ##\sigma_1## is the standard deviation of ##X_1##. Likewise, ##\mu_2## is the mean of ##X_2##. The standard deviation of ##X_2## is more complicated: the SDs of the individual terms ##\rho\epsilon_1## and ##\sigma_2\epsilon_2## are ##\rho## and ##\sigma_2##, but since those terms are independent, the variance of their sum is the sum of the variances, giving ##\mathrm{SD}(X_2)=\sqrt{\rho^2+\sigma_2^2}##.
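A quick simulation checks these mean and SD formulas. The parameter values below are illustrative choices for the sketch, not values taken from the thread:

```python
import numpy as np

# Illustrative parameter values (assumed for this sketch, not from the thread).
mu1, sigma1 = 1.0, 2.0
mu2, rho, sigma2 = -0.5, 0.8, 1.5

rng = np.random.default_rng(0)
n = 1_000_000
eps1 = rng.standard_normal(n)  # iid N(0,1)
eps2 = rng.standard_normal(n)  # independent of eps1

X1 = mu1 + sigma1 * eps1
X2 = mu2 + rho * eps1 + sigma2 * eps2

# Independent terms: Var(X2) = rho^2 + sigma2^2, so SD(X2) = sqrt(rho^2 + sigma2^2).
print(X2.mean())  # ≈ mu2 = -0.5
print(X2.std())   # ≈ sqrt(0.8^2 + 1.5^2) = 1.7
```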
 
FactChecker said:
The example in the link does not say exactly what they are, but we can make the reasonable assumption that he is using the very common notation, where all of the ##\mu##s, ##\rho##s, and ##\sigma##s are real-number constants. In that case, ##\mu_1## is the mean and ##\sigma_1## is the standard deviation of ##X_1##. Likewise, ##\mu_2## is the mean of ##X_2##. The standard deviation of ##X_2## is more complicated: the SDs of the individual terms are ##\rho## and ##\sigma_2##, but the sum of independent terms has a variance which is the sum of the variances.
Okay, thanks. So I'm missing the crux: why are these dependent instead of independent? It seems to be because ##\epsilon_1## is a function of ##X_1##, and so ##X_2## implicitly depends on ##X_1##?
 
No. ##\epsilon_1## is not a function of ##X_1##. It is the other way around.
It is not that ##X_2## depends on ##X_1##. It is better to understand that they both depend on ##\epsilon_1##, so their tendencies are related. That makes them correlated and not independent.

(PS. I don't like to say that ##X_1## and ##X_2## are dependent until you are comfortable with what that means in probability. It just means that the tendencies of one give a hint to the tendencies of the other. It does not mean the functional dependency that you are probably used to.)
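The shared ##\epsilon_1## can be seen directly in the covariance: ##\mathrm{Cov}(X_1,X_2)=\mathrm{Cov}(\sigma_1\epsilon_1,\ \rho\epsilon_1)=\sigma_1\rho\,\mathrm{Var}(\epsilon_1)=\sigma_1\rho##, which is nonzero whenever ##\rho\neq 0##. A simulation with illustrative (assumed) parameter values shows this:

```python
import numpy as np

# Illustrative values (assumed for the sketch; any rho != 0 shows the effect).
mu1, sigma1 = 1.0, 2.0
mu2, rho, sigma2 = -0.5, 0.8, 1.5

rng = np.random.default_rng(1)
n = 1_000_000
eps1 = rng.standard_normal(n)
eps2 = rng.standard_normal(n)

X1 = mu1 + sigma1 * eps1
X2 = mu2 + rho * eps1 + sigma2 * eps2

# The shared eps1 gives Cov(X1, X2) = sigma1 * rho * Var(eps1) = sigma1 * rho.
sample_cov = np.cov(X1, X2)[0, 1]
print(sample_cov)                  # ≈ sigma1 * rho = 1.6
print(np.corrcoef(X1, X2)[0, 1])   # ≈ rho / sqrt(rho^2 + sigma2^2) ≈ 0.47
```

Note that neither variable is a function of the other: knowing ##X_1## narrows down ##\epsilon_1##, which shifts the conditional distribution of ##X_2## without determining it.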
 
FactChecker said:
No. ##\epsilon_1## is not a function of ##X_1##. It is the other way around.
It is not that ##X_2## depends on ##X_1##. It is better to understand that they both depend on ##\epsilon_1##, so their tendencies are related. That makes them correlated and not independent.

(PS. I don't like to say that ##X_1## and ##X_2## are dependent until you are comfortable with what that means in probability. It just means that the tendencies of one give a hint to the tendencies of the other. It does not mean the functional dependency that you are probably used to.)
Perfect explanation, thanks so much!
 
Consider human arm length and leg length. Clearly, they are related and not independent, yet there are many other factors involved and one does not cause the other. It is just that their tendencies are related.
 
For one concrete example, a similar form appears in Moving Average models in time series analysis:

[Attached image, presumably the MA(q) definition from the linked article: ##X_t = \mu + \epsilon_t + \theta_1\epsilon_{t-1} + \cdots + \theta_q\epsilon_{t-q}##]


https://en.wikipedia.org/wiki/Moving-average_model
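In an MA(1) model, ##X_t = \mu + \epsilon_t + \theta\epsilon_{t-1}##, consecutive observations share the shock ##\epsilon_{t-1}## exactly the way ##X_1## and ##X_2## above share ##\epsilon_1##, so the lag-1 autocorrelation is ##\theta/(1+\theta^2)## while observations two or more steps apart are uncorrelated. A short simulation (with an assumed illustrative ##\theta##) demonstrates this:

```python
import numpy as np

# MA(1): X_t = mu + eps_t + theta * eps_{t-1}; consecutive terms share a shock.
# mu and theta are illustrative values, not taken from the thread.
mu, theta = 0.0, 0.6
rng = np.random.default_rng(2)
n = 1_000_000
eps = rng.standard_normal(n + 1)

X = mu + eps[1:] + theta * eps[:-1]

# Lag-1 autocorrelation of an MA(1) is theta / (1 + theta^2); lag 2 is zero.
print(np.corrcoef(X[:-1], X[1:])[0, 1])  # ≈ 0.6 / 1.36 ≈ 0.44
print(np.corrcoef(X[:-2], X[2:])[0, 1])  # ≈ 0
```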
 
