So I'm a little confused about how to calculate the uncertainty of a slope. Let's say I have data points (1, 2), (2, 2.75), (3, 3.75), (4, 4.7), (5, 5.5), which when put into Excel give me a slope of 0.895. Let's say the uncertainty for every point is ±0.1. What I used to do is take the maximum possible slope through the first and last points and the minimum possible slope, subtract one from the other, and divide by 2 to get the uncertainty of the slope. So the max slope would be ((5.5+0.1)−(2−0.1))/((5−0.1)−(1+0.1)), and then I would basically do the opposite to find the min slope. But recently I discovered that this is not an accurate way to find the uncertainty of the slope. Does anybody know how to find the uncertainty of a slope?
This appears to be what you're looking for: https://www.che.udel.edu/pdf/FittingData.pdf Since your data points have uncertainties associated with them (more precisely, with their y-values), scroll down to the section "Weighted Least Squares Straight Line Fitting," which begins on page 8. It might help to skim through the preceding pages first.
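To make that concrete, here is a quick Python sketch of the standard weighted least-squares formulas for a straight-line fit (this is my own illustration, not code from that PDF; it assumes the same ±0.1 uncertainty on every y-value, as in your example, and the function name is just something I made up):

```python
import math

def weighted_line_fit(x, y, sigma):
    """Weighted least-squares fit of y = m*x + b, where sigma[i] is the
    uncertainty on y[i].  Returns (m, b, sigma_m, sigma_b)."""
    w = [1.0 / s**2 for s in sigma]          # weights w_i = 1/sigma_i^2
    S   = sum(w)
    Sx  = sum(wi * xi for wi, xi in zip(w, x))
    Sy  = sum(wi * yi for wi, yi in zip(w, y))
    Sxx = sum(wi * xi * xi for wi, xi in zip(w, x))
    Sxy = sum(wi * xi * yi for wi, xi, yi in zip(w, x, y))
    delta = S * Sxx - Sx**2
    m = (S * Sxy - Sx * Sy) / delta          # best-fit slope
    b = (Sxx * Sy - Sx * Sxy) / delta        # best-fit intercept
    sigma_m = math.sqrt(S / delta)           # uncertainty of the slope
    sigma_b = math.sqrt(Sxx / delta)         # uncertainty of the intercept
    return m, b, sigma_m, sigma_b

x = [1, 2, 3, 4, 5]
y = [2, 2.75, 3.75, 4.7, 5.5]
sigma = [0.1] * len(y)
m, b, sigma_m, sigma_b = weighted_line_fit(x, y, sigma)
print(f"slope = {m:.3f} +/- {sigma_m:.3f}")  # slope = 0.895 +/- 0.032
```

Note that with equal uncertainties on every point the slope itself is the same as the ordinary (Excel) least-squares slope of 0.895, but you also get an uncertainty of about ±0.03, which is considerably smaller than what the max-slope/min-slope method gives, because the fit uses all five points rather than just the first and last.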