Undergrad: Significant correlation, not significant coefficient

Summary
The discussion centers on discrepancies between correlation and regression results in a panel data analysis, specifically regarding the variable "Anleggsmidler/eiendeler," which shows a significant correlation but is not significant in regression. This inconsistency is attributed to multicollinearity among the independent variables, which can lead to unreliable coefficient estimates. A VIF test indicates acceptable levels of multicollinearity, but it is suggested that removing "Anleggsmidler/eiendeler" from the model may yield clearer results. Additionally, there is a query about including standardized coefficients in fixed effects regression in Stata, with advice to consider a reduced model. The conversation highlights the importance of addressing multicollinearity for accurate regression analysis.
monsmatglad:
A couple of questions today. First: I am running a panel data regression. I begin by checking the correlations between the independent variables and the dependent variable; these are the results.
[Attached image: Pearson correlation matrix of the variables]


D/(D+Em) is the dependent variable, and the independent variables are the four adjacent ones. Disregard the two outermost variables (the red area). The independent variable "Anleggsmidler/eiendeler" has a Pearson correlation coefficient with the dependent variable of 0.177, which, as you can see, is highly significant. However, when I do the regression analysis (shown below), the relation between "Anleggsmidler/eiendeler" and the dependent variable is not significant at all. How come the results are so different in terms of significance?

[Attached image: panel regression output]

Second question: is there any command to include standardised coefficients in fixed effects regression results in Stata? The beta option does not work when I use fixed effects regression (I am pretty green when it comes to Stata). Any advice on these two questions? Thanks in advance!
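For what it's worth, xtreg has no beta option, so one workaround that is often suggested is to z-standardize the variables yourself and re-run the fixed effects regression on the standardized versions. A minimal sketch, with placeholder names (y, x1, x2, id, year) rather than the thread's actual variables:

* placeholder sketch: manual standardization in place of the
* beta option, which xtreg does not support
xtset id year

* z-standardize each variable using its overall mean and SD
egen z_y = std(y)
egen z_x1 = std(x1)
egen z_x2 = std(x2)

* fixed effects regression on the standardized variables;
* the coefficients are now on a standard-deviation scale
xtreg z_y z_x1 z_x2, fe

Note that egen's std() standardizes with the overall mean and standard deviation, which may or may not be the standardization you want for a within (fixed effects) estimator.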
 
You have pretty substantial multicollinearity in this data. The estimates of the individual coefficients will be unreliable in the regression.
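A simulated sketch of how this happens (hypothetical data, not the thread's): generate two strongly collinear predictors where only one actually drives y. Both correlate significantly with y in a pairwise sense, yet the regression can leave one coefficient insignificant:

clear
set seed 12345
set obs 1000
gen x1 = rnormal()
gen x2 = x1 + 0.3*rnormal()   // x2 is strongly collinear with x1
gen y = x1 + rnormal()        // y actually depends only on x1
pwcorr y x1 x2, sig           // both pairwise correlations with y are significant
regress y x1 x2               // the coefficient on x2 is typically insignificant
estat vif                     // large VIFs flag the collinearity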
 
Hey. I used a VIF test and it produced the following numbers. I understand that what counts as acceptable differs, but as far as I know, values below 10 are usually no disaster (based on Google searches, not on my own acumen).

Variable       VIF    1/VIF
lndriftsin~r   5.72   0.174921
anleggsmid~r   5.02   0.199399
totalrenta~t   1.34   0.743779
markedbok      1.18   0.845347
Mean VIF       3.31
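For context, estat vif is available after regress but not after xtreg, so VIFs like these are typically obtained by re-running the same specification as a pooled OLS just to get the diagnostics. A sketch with placeholder names (y, x1..x4) standing in for the abbreviated variables above:

* pooled OLS of the same specification, run only to obtain VIFs
regress y x1 x2 x3 x4
estat vif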
 
Once the (estimated) effects of the other variables are taken into account, there is not enough independent variation left for Anleggsmidler/eiendeler to be statistically significant. It should be removed from the model and the linear regression re-run without it.
 
monsmatglad said:
I used a VIF-test
Instead, just try running a reduced model, deleting totalrentabilitat.
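Concretely, the reduced model is just the same fixed effects regression with that regressor dropped. A sketch with placeholder names again, assuming x3 stands in for totalrentabilitat:

* reduced fixed effects model with the x3 regressor omitted
xtreg y x1 x2 x4, fe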
 
