kingwinner
1) "Simple linear regression model: Yi = β0 + β1Xi + εi , i=1,...,n where n is the number of data points, εi is random error
We want to estimate β0 and β1 based on our observed data. The estimates of β0 and β1 are denoted by b0 and b1, respectively."
I don't understand the difference between β0, β1 and b0, b1.
For example, when we see a scatter plot with a least-squares line of best fit, say, y = 8 + 5x, then β0 = 8, β1 = 5, right? What are b0 and b1 all about? Why do we need to introduce b0, b1?
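To make the β-versus-b distinction concrete, here is a minimal sketch (the true parameter values, noise level, and sample size are all illustrative assumptions, not from the quoted text). It simulates data from a model with known β0 and β1 and then computes the least-squares estimates b0 and b1, which come out close to, but not exactly equal to, the true values:

```python
# Sketch: true parameters (beta0, beta1) vs. their least-squares
# estimates (b0, b1). All numbers here are illustrative assumptions.
import random

random.seed(0)
beta0, beta1 = 8.0, 5.0                        # true parameters (unknown in practice)
n = 50
x = [random.uniform(0, 10) for _ in range(n)]
eps = [random.gauss(0, 2) for _ in range(n)]   # random errors eps_i
y = [beta0 + beta1 * xi + e for xi, e in zip(x, eps)]

# Standard least-squares formulas for simple linear regression
xbar = sum(x) / n
ybar = sum(y) / n
b1 = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
      / sum((xi - xbar) ** 2 for xi in x))
b0 = ybar - b1 * xbar

print(b0, b1)  # close to, but not exactly, 8 and 5
```

The point: β0 and β1 belong to the (unobservable) model, while b0 and b1 are what you actually compute from a finite sample; a different sample would give slightly different b0, b1.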
2) "Simple linear regression model: Yi = β0 + β1Xi + εi , i=1,...,n where n is the number of data points, εi is random error
Fitted value of Yi for each Xi is: Yi hat = b0 + b1Xi
Residual = vertical deviations = Yi - Yi hat = ei
where Yi is the actual observed value of Y, and Yi hat is the value of Y predicted by the model"
Now I don't understand the difference between random error (εi) and residual (ei). What is the meaning of εi? How are εi and ei different?
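The error-versus-residual distinction can also be sketched numerically (the parameter values and estimates below are made-up toy numbers, not from the quoted text): εi is measured against the true line, which requires knowing β0 and β1, while ei is measured against the fitted line and is always computable from the data.

```python
# Sketch: random error eps_i = Y_i - (beta0 + beta1*X_i), defined from
# the true (unknown) line, vs. residual e_i = Y_i - Yhat_i, defined
# from the fitted line. Toy numbers are illustrative assumptions.
beta0, beta1 = 8.0, 5.0   # true parameters (unknown in practice)
b0, b1 = 7.6, 5.1         # hypothetical least-squares estimates

x_i = 2.0
eps_i = -0.3                            # the (unobservable) random error
y_i = beta0 + beta1 * x_i + eps_i       # observed value Y_i = 17.7
yhat_i = b0 + b1 * x_i                  # fitted value Yhat_i = 17.8
e_i = y_i - yhat_i                      # residual, about -0.1

print(eps_i, e_i)  # eps_i and e_i are close, but not equal
```

So ei is the computable stand-in for εi: residuals estimate the errors, just as b0, b1 estimate β0, β1.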
Thanks for explaining!