Bivariate correlation does not always catch multicollinearity

  • Context: Undergrad 
SUMMARY

This discussion centers on the concept of multicollinearity in regression analysis, specifically how multiple predictors can exhibit low pairwise correlations while still being collectively correlated with the response variable. The example provided illustrates that even with pairwise correlations of approximately 0.2 among predictors \(X_1\), \(X_2\), and \(X_3\), the combined correlation \(r_{123}\) can exceed 0.7. The variance inflation factor (VIF) is highlighted as a crucial tool for assessing these complex relationships, emphasizing that visual representations like Venn diagrams may not adequately capture the nuances of multicollinearity.

PREREQUISITES
  • Understanding of multicollinearity in regression analysis
  • Familiarity with correlation coefficients and their interpretation
  • Knowledge of the variance inflation factor (VIF)
  • Basic grasp of regression equations and predictors
NEXT STEPS
  • Explore the calculation and interpretation of the variance inflation factor (VIF)
  • Study the implications of multicollinearity on regression coefficients
  • Learn about advanced regression techniques that mitigate multicollinearity
  • Investigate the use of principal component analysis (PCA) for dimensionality reduction
USEFUL FOR

Data analysts, statisticians, and researchers involved in regression modeling and predictive analytics will benefit from this discussion, particularly those seeking to understand the complexities of multicollinearity and its impact on model accuracy.

fog37
TL;DR
Bivariate correlation does not always catch multicollinearity
Hello,

While studying multicollinearity, I learned that with more than 2 predictors ##X##, for example 3 predictors ##X_1, X_2, X_3##, it is possible for all of the pairwise correlations to be low in value while multicollinearity is still an issue... Would that mean that some "triple" correlation, i.e. the average of the products ##(X_1 X_2 X_3)##, has a high value (higher than 0.7)? Is that correct?

Would you have a simple example of how three variables may be correlated collectively even if their pairwise correlations are low?

Thank you!
 
In a visual sense, using Venn diagrams, how can the predictors be collectively correlated if they are not pairwise correlated at all? The figures below show moderate multicollinearity and strong multicollinearity. I don't see how the ##X## circles could fail to overlap and still cause multicollinearity...

[Figure: Venn diagrams illustrating moderate and strong multicollinearity]
 
It may depend on how low you demand the individual pairwise correlations to be. Suppose that ##X_1## and ##X_2## are independent, identically distributed random variables and that ##Y = X_1+X_2##. Then I think it is clear that the correlation of ##Y## with any one ##X_i## may be smaller than the threshold even though ##Y## is a deterministic function of ##X_1, X_2##.
In fact, it gets easier when ##Y## is a function of more independent ##X_i## variables. Any one ##X_i## might have a low correlation with ##Y## but the combination of all the ##X_i##s might completely determine ##Y##. Suppose ##Y = X_1+X_2+...+X_{100}##, where the ##X_i##s are pairwise independent.
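This effect is easy to check numerically. The sketch below (my own illustration, not from the thread) simulates exactly the construction above: ##Y## is the sum of ##n## i.i.d. predictors, so it is completely determined by them, yet its correlation with any single predictor is only ##1/\sqrt{n}##.

```python
# Sketch: Y = X_1 + ... + X_n with i.i.d. X_i implies corr(Y, X_i) = 1/sqrt(n),
# so each pairwise correlation shrinks as n grows even though the X_i
# jointly determine Y exactly.
import numpy as np

rng = np.random.default_rng(0)

for n in (2, 100):
    X = rng.standard_normal((50_000, n))  # i.i.d. standard normal predictors
    Y = X.sum(axis=1)                     # Y is fully determined by the X_i
    r = np.corrcoef(Y, X[:, 0])[0, 1]     # correlation of Y with one predictor
    print(f"n={n:3d}: corr(Y, X_1) = {r:.3f}, theory 1/sqrt(n) = {1/np.sqrt(n):.3f}")
```

With ##n = 100## the simulated correlation comes out near 0.1, which would look harmless in a pairwise correlation table even though the regression of ##Y## on all predictors is exact.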
 
FactChecker said:
It may depend on how low you demand the individual pairwise correlations to be. Suppose that ##X_1## and ##X_2## are independent, identically distributed random variables and that ##Y = X_1+X_2##. Then I think it is clear that the correlation of ##Y## with any one ##X_i## may be smaller than the threshold even though ##Y## is a deterministic function of ##X_1, X_2##.
In fact, it gets easier when ##Y## is a function of more independent ##X_i## variables. Any one ##X_i## might have a low correlation with ##Y## but the combination of all the ##X_i##s might completely determine ##Y##. Suppose ##Y = X_1+X_2+...+X_{100}##, where the ##X_i## are pairwise independent.
Multicollinearity is when the predictors are correlated in such a way that the estimated coefficient for a predictor, which should indicate the change in ##Y## per unit change in that ##X##, is not what it really is: because ##X_1## and ##X_2## are correlated, when ##X_1## changes by one unit we cannot hold ##X_2## fixed, since it changes too...

Let's say ##Y=b_1 X_1 + b_2 X_2 + b_3 X_3##, and the predictors ##X## are nearly pairwise uncorrelated, with low correlation coefficients: ##r_{12} = r_{13} = r_{23} \approx 0.2##. That alone is not proof of the absence of multicollinearity...

It could be that ##r_{123} \approx 0.8##... But how could that be? How can the predictors be more correlated collectively than pairwise? I am struggling to see that, especially visually with the Venn diagram, where each smaller circle represents the variance of an ##X## and the larger circle the variance of ##Y##...
 
Oh, maybe I get it now... It could be that ##Y=\beta_1 X_1 +\beta_2 X_2 + \beta_3 X_3## and the three regressors are pairwise uncorrelated with each other, BUT the correlation between ##X_1## and, for example, the sum ##X_2+X_3## could be nonzero and high in value. The same goes for the correlation between ##X_2## and ##X_1+X_3##, etc.

I think that is what the variance inflation factor (VIF) checks, instead of focusing on the pairwise correlations: for each predictor ##X_j## it measures how well that predictor is explained by all the other predictors together, via ##VIF_j = 1/(1-R_j^2)##, where ##R_j^2## comes from regressing ##X_j## on the remaining predictors. These combined relationships cannot be visualized with the Venn diagrams of the individual predictors and the response variable...
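The gap between pairwise correlation and VIF can be demonstrated with a concrete construction (my own, not from the thread): take 25 independent predictors and add a 26th that is almost their scaled sum. Every pairwise correlation stays around 0.2, yet the last predictor is nearly a linear combination of the others, so its VIF is enormous.

```python
# Sketch: low pairwise correlations can coexist with a huge VIF.
# 25 i.i.d. predictors plus a 26th equal to their scaled sum + small noise.
import numpy as np

rng = np.random.default_rng(1)
n, p = 50_000, 25
X = rng.standard_normal((n, p))
x_last = X.sum(axis=1) / np.sqrt(p) + 0.1 * rng.standard_normal(n)
X = np.column_stack([X, x_last])

# Largest pairwise correlation among all predictors (off-diagonal entries)
C = np.corrcoef(X, rowvar=False)
max_pairwise = np.max(np.abs(C - np.eye(p + 1)))

# VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing X_j on the rest
def vif(X, j):
    y = X[:, j]
    Z = np.delete(X, j, axis=1)
    Z = np.column_stack([np.ones(len(Z)), Z])      # add an intercept column
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)   # OLS fit of X_j on the rest
    resid = y - Z @ beta
    r2 = 1 - resid.var() / y.var()
    return 1 / (1 - r2)

print(f"max pairwise |corr| = {max_pairwise:.2f}")  # around 0.2
print(f"VIF of last predictor = {vif(X, p):.1f}")   # far above the usual cutoff of 10
```

No pairwise screen at a 0.2 threshold would flag anything here, while the VIF of the last column lands near 100: exactly the "collective but not pairwise" correlation discussed above.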
 
fog37 said:
Oh, maybe I get it now...It could be that ##Y=\beta_1 X_1 +\beta_2 X_2 + \beta_3 X_3## and the three regressors are pairwise uncorrelated to each other BUT the correlations between ##X_1## and, for example, the variable given by the sum ##X_2+X_3## to be nonzero and high in value.
Not if the ##X_i##s are independent. Then ##X_1## would be uncorrelated to ##X_2+X_3##.

I probably should leave this for others since I am not an expert. But if ##Y = X_1+X_2##, where the ##X##s are independent, then ##Y, X_1, X_2## are all estimators of ##Y## to varying extents. ##X_1## and ##X_2## are independent. ##Y## is somewhat correlated to an individual ##X_i##, but completely determined by the pair. The more ##X_i##s there are in the sum, the weaker would be the correlation between ##Y## and the individual ##X_i##s.
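The claim that independence kills the correlation with the sum can also be checked directly. A minimal sketch (my own, not from the thread): draw three independent variables and correlate the first with the sum of the other two.

```python
# Sketch: if X_1, X_2, X_3 are independent, then X_1 is uncorrelated
# with X_2 + X_3 (the sum of variables independent of X_1 stays
# uncorrelated with X_1).
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal((200_000, 3))
r = np.corrcoef(X[:, 0], X[:, 1] + X[:, 2])[0, 1]
print(f"corr(X_1, X_2 + X_3) = {r:.4f}")  # close to 0 up to sampling noise
```

So the "sum of the other predictors" explanation only works when the regressors are correlated to begin with; with fully independent regressors there is no multicollinearity at all.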
 
