Expected coefficient change from simple to multiple linear regression

In summary: structural collinearity refers to a situation where the correlation among the explanatory variables arises from a structural relationship between the variables themselves, for example between ##X## and ##X^2## in a polynomial model, rather than from the variables being driven by some common outside variable. It can be a problem because it inflates the variance of the estimated coefficients, which may then change in magnitude, and even in sign, between a simple and a multiple regression.
  • #1
fog37
TL;DR Summary
understand the expected coefficient change (magnitude and sign) from simple to multiple linear regression
Hello forum,

I have created some linear regression models based on a simple dataset with 4 variables (columns). The first models simply involve one predictor variable: $$Y=\beta_1 X_1+\beta_0$$ and $$Y=\beta_2 X_2+ \beta_0$$
The third model is a multiple linear regression model involving all three predictors: $$Y= \beta_3 X_3 + \beta_2 X_2 + \beta_1 X_1 + \beta_0$$
I believe that the coefficients ##\beta_1## and ##\beta_2## for the predictors ##X_1## and ##X_2## change in magnitude when the two predictors are included together in the multivariate model (model 3), correct? What about the sign of those coefficients: should it stay the same, or can it change?

I would think that the sign should remain the same, indicating that ##Y## and ##X_1## (or ##X_2##) vary in the same direction in both the simple and the multiple linear regression models...

Now, if multicollinearity is present, the coefficient of each predictor could change in both magnitude and sign relative to its value in the simple linear regression model, and not in an easily interpretable way, because of the inter-variable correlation...
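As a rough illustration (a sketch of my own with made-up numbers, not the dataset above), the following Python snippet builds two strongly correlated predictors and shows the fitted coefficient of ##X_1## flipping sign between the simple and the multiple fit:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
x1 = rng.normal(size=n)
x2 = x1 + 0.3 * rng.normal(size=n)       # x2 strongly correlated with x1
y = -1.0 * x1 + 2.0 * x2 + rng.normal(size=n)

# Simple regression: Y on X1 alone (with intercept)
A1 = np.column_stack([x1, np.ones(n)])
b_simple, *_ = np.linalg.lstsq(A1, y, rcond=None)

# Multiple regression: Y on X1 and X2 together
A12 = np.column_stack([x1, x2, np.ones(n)])
b_multi, *_ = np.linalg.lstsq(A12, y, rcond=None)

print("beta1, X1 alone:   ", b_simple[0])  # ~ +1.0 (absorbs x2's effect)
print("beta1, X1 with X2: ", b_multi[0])   # ~ -1.0 (the sign flips)
```

In the simple fit, ##X_1## acts as a proxy for the omitted ##X_2##, so its coefficient picks up ##X_2##'s effect and comes out positive; once ##X_2## is in the model, the direct effect of ##X_1## (negative here by construction) is recovered.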

Thanks
 
  • #2
I agree with you, but with a couple of caveats:
  1. In real-world models, multicollinearity (correlation between the explanatory variables ##X_1, X_2, X_3##) is usually present, which undermines the expectation stated in your second-to-last paragraph.
  2. Even without genuine multicollinearity, random idiosyncratic variation in the sample can create the appearance of multicollinearity, in which case we can still get sign changes in coefficients. This will not usually happen, but it will sometimes; the larger the data set, the less often it will. A quick simulation sketch follows below.
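To make caveat 2 concrete, here is a minimal sketch (my own; the coefficients 0.2 and 1.0, the noise level, and the sample sizes are arbitrary choices) that generates truly independent predictors and counts how often the fitted sign of ##\beta_1## disagrees between the simple and the multiple regression:

```python
import numpy as np

def sign_flip_rate(n, reps=2000, seed=0):
    """Fraction of replications in which the fitted sign of beta1
    differs between the simple and the multiple regression, even
    though X1 and X2 are generated independently."""
    rng = np.random.default_rng(seed)
    flips = 0
    for _ in range(reps):
        x1 = rng.normal(size=n)
        x2 = rng.normal(size=n)          # truly independent of x1
        y = 0.2 * x1 + 1.0 * x2 + rng.normal(size=n)
        b1_simple = np.linalg.lstsq(
            np.column_stack([x1, np.ones(n)]), y, rcond=None)[0][0]
        b1_multi = np.linalg.lstsq(
            np.column_stack([x1, x2, np.ones(n)]), y, rcond=None)[0][0]
        flips += np.sign(b1_simple) != np.sign(b1_multi)
    return flips / reps

for n in (20, 200, 2000):
    print(n, sign_flip_rate(n))          # the rate shrinks as n grows
```

With ##n=20## the sampling noise dominates the small true effect of ##X_1## and sign disagreements are common; by ##n=2000## they have all but vanished.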
 
  • #3
andrewkirk said:
I agree with you, but with a couple of caveats:
  1. In real-world models, multicollinearity (correlation between the explanatory variables ##X_1, X_2, X_3##) is usually present, which undermines the expectation stated in your second-to-last paragraph.
  2. Even without genuine multicollinearity, random idiosyncratic variation in the sample can create the appearance of multicollinearity, in which case we can still get sign changes in coefficients. This will not usually happen, but it will sometimes; the larger the data set, the less often it will.
Thanks for the quick and interesting reply. I am indeed surprised to learn that, even without any multicollinearity, a change in coefficient sign is possible when the same variables of interest appear in both a simple and a multiple regression model...

Regarding multicollinearity, my understanding is that it affects the coefficients' values in strange ways. I recently learned that, in the case of a model with a linear term ##X## and a quadratic term ##X^2##, $$Y=\beta_2 X^2 + \beta_1 X + \beta_0,$$ multicollinearity is supposedly not a problem even though ##X## and ##X^2## are dependent (just not linearly dependent). Isn't the fact that one variable changing causes a change in the other variable the prime definition of multicollinearity?
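For what it is worth, the two terms are not just dependent; over a positive range they are very nearly linearly related. A quick check (my own sketch; the range ##[0, 10]## is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=10_000)

# X and X^2 are deterministically related, and over a strictly
# positive range they are also almost *linearly* related:
print(np.corrcoef(x, x**2)[0, 1])   # ~0.97: strong collinearity
```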
 
  • #4
fog37 said:
Isn't the fact that one variable changing causes a change in the other variable the prime definition of multicollinearity?
No, they just have to be correlated; causation is not part of the definition (e.g. see here). A common situation is where the correlation arises because each of the explanatory variables is driven ("caused") by another variable that may not be part of the set of explanatory variables. For example, in a regression that had population crime levels and sickness levels as explanatory variables, we would likely find that those two are correlated because both are driven by a third variable, average population wealth, which may not be among the explanatory variables.
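A small simulation shows the mechanism (my own sketch; the variable names and the coefficient ##-0.8## are purely illustrative): crime and sickness end up correlated only because both are driven by wealth.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
wealth = rng.normal(size=n)                    # the hidden common driver
crime = -0.8 * wealth + rng.normal(size=n)     # driven by wealth
sickness = -0.8 * wealth + rng.normal(size=n)  # also driven by wealth

# Correlated, even though neither variable causes the other:
print(np.corrcoef(crime, sickness)[0, 1])      # ~0.4
```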
 
  • #5
andrewkirk said:
No, they just have to be correlated; causation is not part of the definition (e.g. see here). A common situation is where the correlation arises because each of the explanatory variables is driven ("caused") by another variable that may not be part of the set of explanatory variables. For example, in a regression that had population crime levels and sickness levels as explanatory variables, we would likely find that those two are correlated because both are driven by a third variable, average population wealth, which may not be among the explanatory variables.
Sure, sorry, I used "cause" inadvertently. But just the fact that ##X## and ##X^2## are deterministically dependent would make me think that structural collinearity would emerge from them...
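It does, for the raw variables; a standard remedy is to center ##X## before squaring. Continuing the uniform-##[0,10]## sketch from above (my own illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=10_000)
xc = x - x.mean()                      # center before squaring

print(np.corrcoef(x,  x**2)[0, 1])    # ~0.97 for the raw variables
print(np.corrcoef(xc, xc**2)[0, 1])   # ~0.0 after centering
```

Centering works here because, for a distribution symmetric about its mean, ##X - \bar X## and ##(X - \bar X)^2## are uncorrelated even though they remain deterministically dependent.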
 

What is the difference between simple and multiple linear regression?

Simple linear regression involves predicting a single outcome variable based on a single predictor variable. Multiple linear regression involves predicting a single outcome variable based on multiple predictor variables.

What is the expected coefficient change from simple to multiple linear regression?

The expected coefficient change from simple to multiple linear regression is the change in the regression coefficients of the predictor variables when additional variables are added to the model. This change can be positive, negative, or zero.

Why is the expected coefficient change important in multiple linear regression?

The expected coefficient change is important because it helps us understand how the relationship between the outcome variable and each predictor variable changes when other variables are included in the model. It can also indicate which variables are most influential in predicting the outcome variable.

How is the expected coefficient change calculated?

The expected coefficient change is calculated by taking the difference between a predictor's coefficient in the multiple linear regression model and its coefficient in the simple linear regression model.
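For example (hypothetical numbers, purely illustrative):

```python
# Coefficient of X1 in each model (made-up values)
beta1_simple = 0.85     # from the fit Y ~ X1
beta1_multiple = 0.42   # from the fit Y ~ X1 + X2 + X3

coefficient_change = beta1_multiple - beta1_simple
print(coefficient_change)   # -0.43
```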

What factors can affect the expected coefficient change in multiple linear regression?

The expected coefficient change can be affected by the correlation between predictor variables, the sample size, and the strength of the relationships between the outcome variable and each predictor variable. It can also be influenced by the inclusion or exclusion of certain variables in the model.
