1) "Simple linear regression model: Y_i = β_0 + β_1 X_i + ε_i, i = 1, ..., n, where n is the number of data points and ε_i is the random error.

We want to estimate β_0 and β_1 based on our observed data. The estimates of β_0 and β_1 are denoted by b_0 and b_1, respectively."

I don't understand the difference between β_0, β_1 and b_0, b_1.

For example, when we see a scatter plot with a least-squares line of best fit, say y = 8 + 5x, then β_0 = 8 and β_1 = 5, right? What are b_0 and b_1 all about? Why do we need to introduce b_0 and b_1?
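A short simulation may make the distinction concrete (a sketch in Python with NumPy; the variable names and the specific numbers are illustrative assumptions, not from the quoted text). The idea: β_0 and β_1 are the fixed, unknown parameters of the population line; b_0 and b_1 are what you actually compute from a finite sample, so they come out close to, but not exactly equal to, the true values:

```python
import numpy as np

rng = np.random.default_rng(0)

# True population parameters (unknown in a real study; chosen here for the demo)
beta0, beta1 = 8.0, 5.0

# Simulate n data points from the model Y_i = beta0 + beta1*X_i + eps_i
n = 50
x = rng.uniform(0, 10, n)
eps = rng.normal(0, 2, n)   # random errors eps_i
y = beta0 + beta1 * x + eps

# Least-squares estimates b0, b1 computed from the observed sample
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()

# b0 and b1 will be near 8 and 5, but generally not exactly equal to them
print(b0, b1)
```

So the fitted line y = 8 + 5x in a textbook plot is really reporting b_0 = 8 and b_1 = 5; the underlying β_0 and β_1 remain unknown.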

2) "Simple linear regression model: Y_i = β_0 + β_1 X_i + ε_i, i = 1, ..., n, where n is the number of data points and ε_i is the random error.

Fitted value of Y_i for each X_i: Ŷ_i = b_0 + b_1 X_i

Residual (vertical deviation): e_i = Y_i − Ŷ_i

where Y_i is the actual observed value of Y, and Ŷ_i is the value of Y predicted by the model."

Now I don't understand the difference between the random error (ε_i) and the residual (e_i). What is the meaning of ε_i? How do ε_i and e_i differ?
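Continuing the simulation sketch above (Python with NumPy; names and numbers are illustrative assumptions): ε_i is the deviation of Y_i from the *true* line β_0 + β_1 X_i, which is unobservable in practice, while e_i is the deviation from the *fitted* line b_0 + b_1 X_i, which you can always compute. In a simulation we know both, so we can compare them:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate from known parameters so both eps_i and e_i are visible
beta0, beta1 = 8.0, 5.0
x = rng.uniform(0, 10, 50)
eps = rng.normal(0, 2, 50)   # true random errors eps_i (unobservable with real data)
y = beta0 + beta1 * x + eps

# Least-squares fit gives b0, b1 and the fitted values Y_i hat
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
y_hat = b0 + b1 * x

# Residuals e_i: observed minus fitted -- these we CAN compute from data
e = y - y_hat

# eps_i and e_i are close but not identical, because the fitted line
# differs slightly from the true line; also, least-squares residuals
# sum to (numerically) zero, while the true errors need not.
print(np.sum(e))
```

In short: e_i is the computable stand-in for the unobservable ε_i, and it inherits extra constraints (such as summing to zero) from the fitting procedure.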

Thanks for explaining!

**Physics Forums - The Fusion of Science and Community**


# Linear Regression Models (2)


