- #1
xeon123
Hi,
I am trying to understand how to find the error in linear regression, and what to do with it. I am using linear regression to predict execution time from the size of the input and the number of tasks the computer uses to produce the result.
1 - In a linear regression, I calculate the error by taking the difference between the regression line and each point, [itex](y-\hat{y})[/itex], in the scatter plot. Is the mean squared error a way to measure this error? Can I use it to improve the prediction? Thanks.
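For what it's worth, the residuals [itex](y-\hat{y})[/itex] and the mean squared error described above can be computed directly. Below is a minimal sketch using ordinary least squares with NumPy; the data values (input sizes, task counts, and execution times) are made up purely for illustration:

```python
import numpy as np

# Hypothetical data: each row of X is (input size, number of tasks),
# y holds the measured execution times. Values are illustrative only.
X = np.array([
    [100.0, 1.0],
    [200.0, 2.0],
    [300.0, 2.0],
    [400.0, 4.0],
    [500.0, 4.0],
])
y = np.array([12.0, 19.0, 31.0, 38.0, 52.0])

# Add an intercept column and fit by ordinary least squares.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Predicted times and residuals (y - y_hat), as in the question.
y_hat = A @ coef
residuals = y - y_hat

# Mean squared error: the average of the squared residuals.
mse = np.mean(residuals ** 2)
print(mse)
```

The MSE summarizes how far the fitted line typically is from the observed points (in squared units of the response); its square root (RMSE) is in the same units as the execution time, which can make it easier to interpret as a typical prediction error.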