I am trying to work out the error in the gradient of a linear graph. I have worked out the RMS error for the y-values, but since I am using Excel to determine the gradient of the graph, I am a little unsure how to work out the percentage error in my gradient from the RMS error. Am I right in thinking that I should work out the gradient by hand using points within my data range, and that the relative error will be the RMS error (for the y-values) divided by the change in y? Is there a simpler way?

Also, I know the graph in theory should pass through (0,0), since the x-values are the magnetic field strength. Should I force the intercept through (0,0), or just include that point when drawing the line of best fit?
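In case it helps to compare, here is a small sketch (with made-up example data) of the standard least-squares result: the standard error of the slope is the residual RMS (with n−2 degrees of freedom) divided by the spread of the x-values, not by the change in y. Excel's LINEST function, with its stats argument set to TRUE, reports this same standard error directly.

```python
import numpy as np

# Hypothetical example data: magnetic field strength (x) vs measured response (y).
x = np.array([0.0, 0.1, 0.2, 0.3, 0.4, 0.5])
y = np.array([0.02, 1.05, 1.98, 3.10, 3.95, 5.02])

# Ordinary least-squares fit: gradient m and intercept c.
m, c = np.polyfit(x, y, 1)

# Residual standard deviation s (RMS error with n - 2 degrees of freedom,
# since two parameters were fitted).
residuals = y - (m * x + c)
n = len(x)
s = np.sqrt(np.sum(residuals**2) / (n - 2))

# Standard error of the gradient: s divided by the spread of the x-values.
se_m = s / np.sqrt(np.sum((x - x.mean())**2))

print(f"gradient = {m:.3f} +/- {se_m:.3f}")
print(f"relative error = {100 * se_m / m:.1f}%")
```

If the theory really does force the line through the origin, the one-parameter fit through (0,0) has gradient Σxy/Σx², but whether to constrain it that way depends on whether a nonzero intercept could indicate a systematic offset worth checking.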