Uncorrelated vs. Independent Variables


Discussion Overview

The discussion revolves around the concepts of uncorrelated and independent variables, particularly in the context of regression analysis and normal distributions. Participants explore the definitions, implications, and examples of these terms, seeking clarity on their relationships and potential misconceptions.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested

Main Points Raised

  • Some participants assert that for normally distributed variables, zero correlation implies independence, while others challenge this by suggesting that uncorrelated normal variables may not be independent unless they are jointly normal.
  • A participant provides a counterexample involving a standard normal variable and a dependent variable constructed from it, illustrating that uncorrelated variables can still be dependent.
  • Another participant questions whether height and weight can be considered correlated but independent, suggesting that they are likely dependent due to the nature of the measurements.
  • There is a discussion about the definitions of independence and joint distribution, with some participants clarifying that joint distribution does not necessarily imply independence.
  • One participant expresses confusion over the definitions and seeks simpler examples to understand the concepts better.

Areas of Agreement / Disagreement

Participants express differing views on the relationship between correlation and independence, particularly in the context of normal distributions. There is no consensus on whether zero correlation implies independence in the absence of the joint-normality condition. The discussion remains unresolved regarding the examples provided, with multiple competing views on the nature of the relationships between variables.

Contextual Notes

Some participants note the importance of joint normality in the context of independence and correlation, indicating that the definitions and examples may depend on specific assumptions that are not universally agreed upon.

musicgold
Hi,

I am confused with respect to these two terms. In a book on regression analysis, I read the following statements.

1. For two normally distributed variables, zero covariance / correlation means independence of the two variables.

2. With the normality assumption, the following equation means that [tex]\mu_i[/tex] and [tex]\mu_j[/tex] are NOT ONLY uncorrelated BUT ALSO independently distributed.


[tex]\mu_i \sim N(0, \sigma^2)[/tex]

I am trying to understand if it is possible to have two variables that are
(a) uncorrelated, and not-independent.
(b) uncorrelated and independent
(c) correlated and not-independent
(d) correlated and independent

I would appreciate it if you could explain each type with one example.

Thanks

MG.
 
If the variables are normally distributed, then correlation is zero if and only if they are independent. (By the way, instead of "not-independent" you should say "dependent".)

In general, if [tex]X, Y[/tex] are independent, their correlation is zero, since

[tex] E[(X-\mu_X)(Y-\mu_Y)] = E[X-\mu_X] \cdot E[Y - \mu_Y] = 0[/tex]

so the covariance, and hence the correlation, is zero.

For uncorrelated but dependent, consider this somewhat classic example. Assume [tex]X[/tex] has a standard normal distribution, let [tex]W[/tex] be independent of [tex]X[/tex] and [tex]P(W=1) = 1/2 = P(W = -1)[/tex]. Set

[tex]Y = W X [/tex]

With a little work you can find that

a) [tex]Y[/tex] and [tex]X[/tex] are uncorrelated,

b) [tex]Y[/tex] has a standard normal distribution (calculate [tex]P(Y \le y) = E[P(Y \le y \mid W)] = E[P(X \le y \mid W)][/tex], and use both the definition of [tex]W[/tex] and the fact that [tex]W, X[/tex] are independent), and

c) [tex]X[/tex] and [tex]Y[/tex] are dependent, since [tex]|Y| = |X|[/tex] always.

For correlated and dependent, look at any multivariate normal distribution with non-zero correlations.

For another uncorrelated but dependent example, let [tex]X[/tex] be uniformly distributed on [tex][-1, 1][/tex] and let [tex]Y = X^2[/tex]. These two variables are not independent, since [tex]Y[/tex] is determined by [tex]X[/tex], but they are uncorrelated, because [tex]E[XY] = E[X^3] = 0 = E[X]\,E[Y][/tex].
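A quick simulation, a hypothetical sketch of my own using only Python's standard library (not from the thread), illustrates both uncorrelated-but-dependent examples: the sample correlation is near zero in each case, yet Y is completely determined by (W, X) or by X alone.

```python
import random

random.seed(0)
n = 100_000

def corr(a, b):
    """Sample (Pearson) correlation of two equal-length sequences."""
    m = len(a)
    ma, mb = sum(a) / m, sum(b) / m
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b)) / m
    va = sum((x - ma) ** 2 for x in a) / m
    vb = sum((y - mb) ** 2 for y in b) / m
    return cov / (va * vb) ** 0.5

# Example 1: X standard normal, W = +/-1 independent of X, Y = W * X.
xs = [random.gauss(0.0, 1.0) for _ in range(n)]
ws = [random.choice([-1, 1]) for _ in range(n)]
ys = [w * x for w, x in zip(ws, xs)]
print(abs(corr(xs, ys)))                              # near zero: uncorrelated
print(all(abs(y) == abs(x) for x, y in zip(xs, ys)))  # |Y| = |X| holds exactly: dependent

# Example 2: X uniform on [-1, 1], Y = X^2.
us = [random.uniform(-1.0, 1.0) for _ in range(n)]
vs = [u * u for u in us]
print(abs(corr(us, vs)))                              # near zero, though Y is a function of X
```

With n = 100,000 samples, the sample correlation of a truly uncorrelated pair is typically within a few thousandths of zero, so the near-zero printouts are robust to the choice of seed.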
 
Summary: (d) is impossible. If X and Y are independent, then X and Y are uncorrelated.

The other three are all possible.

However, when the RVs are jointly normal, (a) is also impossible. For jointly normal random variables X and Y, we have: X and Y are independent if and only if X and Y are uncorrelated.
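To see case (c), correlated and dependent, concretely, here is a minimal sketch of my own construction (not from the thread): build Y = ρX + √(1−ρ²)Z from independent standard normals X and Z, which yields a jointly normal pair with correlation ρ.

```python
import random

random.seed(1)
n = 100_000
rho = 0.8

xs = [random.gauss(0.0, 1.0) for _ in range(n)]
zs = [random.gauss(0.0, 1.0) for _ in range(n)]
# Jointly normal pair with correlation rho: correlated, hence dependent.
ys = [rho * x + (1 - rho ** 2) ** 0.5 * z for x, z in zip(xs, zs)]

def corr(a, b):
    """Sample (Pearson) correlation of two equal-length sequences."""
    m = len(a)
    ma, mb = sum(a) / m, sum(b) / m
    cov = sum((p - ma) * (q - mb) for p, q in zip(a, b)) / m
    va = sum((p - ma) ** 2 for p in a) / m
    vb = sum((q - mb) ** 2 for q in b) / m
    return cov / (va * vb) ** 0.5

print(corr(xs, ys))  # close to 0.8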
 
statdad and g_edgar,

Thanks.

I thought the term 'independent' here was the opposite of 'joint', as in 'jointly distributed'.

Also, in terms of examples, I was looking for more simple explanations. For example, can we say
the Height and Weight variables for a certain population are correlated but independent?

I found some discussion at the end of the http://www.ccl.rutgers.edu/~ssi/thesis/thesis-node53.html web page, but it is not very clear to me.

Thanks,

MG.
 
musicgold said:
For example, can we say
the Height and Weight variables for a certain population are correlated but independent?

I would not expect them to be independent, since taller people tend to weigh more than shorter people.
 
Some return comments.

musicgold said:
statdad and g_edgar,

Thanks.

I thought the term 'independent' here was the opposite of 'joint', as in 'jointly distributed'.

No, variables that are jointly distributed may or may not be independent.
Also, in terms of examples, I was looking for more simple explanations. For example, can we say
the Height and Weight variables for a certain population are correlated but independent?
No - if you look at a group of people and measure (say) each person's height and weight, those measured variables will be correlated - as another poster says, taller people tend to weigh more - but the more central point is that both measurements are taken from the same person.
I found some discussion at the end of the http://www.ccl.rutgers.edu/~ssi/thesis/thesis-node53.html web page, but it is not very clear to me.

Those are good notes, but they may be more advanced than your current investigations (I'm not sure of your mathematical background).
 
musicgold said:
I am confused with respect to these two terms. In a book on regression analysis, I read the following statements.

1. For two normally distributed variables, zero covariance / correlation means independence of the two variables.

No, that's not right. Two uncorrelated normal variables need not be independent. I added a counterexample to the PlanetMath website a while ago: http://planetmath.org/encyclopedia/SumsOfNormalRandomVariablesNeedNotBeNormal.html . Are you sure your book doesn't add the extra requirement that they are "jointly normal"? (That is more than just saying each variable is normal.)

Edit: I see statdad's example also showed this, but his post started with "If the variables are normally distributed, then correlation is zero if and only if they are independent." which is wrong, unless by 'normally distributed' he meant 'joint normal'.
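The PlanetMath counterexample uses the same W·X construction as statdad's post; a quick sketch (my own, assuming that construction) shows why the pair is not jointly normal even though each marginal is: X + Y = (1 + W)X equals 0 with probability 1/2, so the sum has a point mass at zero and cannot itself be normal, ruling out joint normality.

```python
import random

random.seed(2)
n = 100_000

xs = [random.gauss(0.0, 1.0) for _ in range(n)]
ws = [random.choice([-1, 1]) for _ in range(n)]
sums = [x + w * x for x, w in zip(xs, ws)]  # X + Y = (1 + W) * X

# Roughly half of the sums are exactly zero -- a point mass that no
# non-degenerate normal distribution has, so (X, Y) is not jointly normal.
zero_frac = sum(s == 0.0 for s in sums) / n
print(zero_frac)  # about 0.5
```

Note that the cancellation is exact in floating point: when W = -1, the sum is x + (-x), which is exactly 0.0.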
 
Statdad and gel,

Thanks a lot. I think I will do more reading on this topic and come back with my questions, if any.

MG.
 
