So I'm a little confused about how to calculate the error uncertainty of a slope. Let's say I have the data points (1, 2), (2, 2.75), (3, 3.75), (4, 4.7), (5, 5.5), which, when put into Excel, give me a slope of 0.895. Let's say the error uncertainty for every point is ±0.1. What I used to do is take the maximum possible slope between the first and last points, subtract the minimum possible slope, and divide by 2 to get the error uncertainty. So the max slope would be ((5.5+0.1)-(2-0.1))/((5-0.1)-(1+0.1)), and then I would do the opposite to find the min slope. But recently I discovered that this is not an accurate way to find the uncertainty of the slope.
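For reference, here is a sketch of the max/min-slope calculation described above (variable names are my own, and the ±0.1 is applied to both x and y, as in the formula quoted):

```python
xs = [1, 2, 3, 4, 5]
ys = [2, 2.75, 3.75, 4.7, 5.5]
err = 0.1  # assumed +/- uncertainty on every coordinate

# Least-squares slope (what Excel's SLOPE/trendline reports).
n = len(xs)
mx = sum(xs) / n
my = sum(ys) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))

# Steepest line: raise the last point, lower the first, shrink the x-span.
max_slope = ((ys[-1] + err) - (ys[0] - err)) / ((xs[-1] - err) - (xs[0] + err))
# Shallowest line: the opposite shifts.
min_slope = ((ys[-1] - err) - (ys[0] + err)) / ((xs[-1] + err) - (xs[0] - err))

# Half the spread between the extreme slopes.
slope_unc = (max_slope - min_slope) / 2

print(f"slope = {slope:.3f}, uncertainty = {slope_unc:.3f}")
# prints: slope = 0.895, uncertainty = 0.094
```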

Does anybody else know how to find the error uncertainty of a slope?

**Physics Forums - The Fusion of Science and Community**

# Error Uncertainty of Slope
