1. The problem statement, all variables and given/known data
I'm being asked to compare the standard deviation of a data set with the root mean square error of the regression line used to model the data, in order to determine the reliability of the regression line.

2. Relevant equations
Mean squared error = variance + bias²

3. The attempt at a solution
I've done some googling and found that the MSE equals the variance plus the bias squared. If I'm understanding this correctly, then when the regression line is reliable, the bias should be low, and hence the RMSE should be approximately equal to the SD?

My main hang-up with this is that if the data just happen to fall on a straight line, the data set can still have a non-zero standard deviation, but the line used to model the data will describe it perfectly and hence have an RMSE of zero. My head is really spinning over this.
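To make that last point concrete, here's a quick Python check I put together (the numbers and variable names are just made up for illustration, not from the actual problem). It fits a least-squares line with numpy.polyfit and compares the SD of the y-values against the RMSE of the residuals, for data that lie exactly on a line:

```python
# Sketch of the hang-up above: data lying exactly on a line have a
# non-zero SD but an RMSE of (essentially) zero. Illustrative values only.
import numpy as np

x = np.arange(10, dtype=float)
y = 3.0 * x + 2.0            # points fall exactly on the line y = 3x + 2

# Standard deviation of the y-values: spread about the mean of y
sd = np.std(y)

# Least-squares line fit, then RMSE of its residuals: spread about the line
slope, intercept = np.polyfit(x, y, 1)
residuals = y - (slope * x + intercept)
rmse = np.sqrt(np.mean(residuals ** 2))

print(f"SD of y: {sd:.4f}")     # non-zero (about 8.62 here)
print(f"RMSE:    {rmse:.4e}")   # ~0, up to floating-point error
```

So the SD is measuring spread about the mean of y, while the RMSE is measuring spread about the fitted line, which seems to be exactly where my confusion is coming from.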