
#1
Jan 22, 2014, 01:13 PM

P: 16

So I'm a little confused about how to calculate the error uncertainty of a slope. Let's say I have data points (1,2), (2,2.75), (3,3.75), (4,4.7), (5,5.5), which when put in Excel give me a slope of 0.895. Let's say the error uncertainty for every point is ±0.1. What I used to do was take the maximum possible slope through the first and last points, subtract the minimum possible slope, and divide by 2 to find the error uncertainty. So the max slope would be ((5.5+0.1)−(2−0.1))/((5−0.1)−(1+0.1)), and then I would basically do the opposite to find the min slope. But recently I discovered that this is not an accurate way to find the uncertainty of the slope.
Does anybody else know how to find the error uncertainty of a slope?
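For reference, the max/min "extreme slopes" method described above can be written out as a short script (the data and the ±0.1 error bars are from the post; the variable names are my own, and, as the post notes, this only gives a rough bound, not a proper statistical uncertainty):

```python
xs = [1, 2, 3, 4, 5]
ys = [2, 2.75, 3.75, 4.7, 5.5]
err = 0.1  # same +/- uncertainty on every point, per the post

# Best-fit slope via ordinary least squares (what Excel's SLOPE returns).
n = len(xs)
x_mean = sum(xs) / n
y_mean = sum(ys) / n
slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys))
         / sum((x - x_mean) ** 2 for x in xs))

# Steepest line through the error bars of the first and last points:
# last y pushed up, first y pushed down, x's pushed toward each other.
max_slope = ((ys[-1] + err) - (ys[0] - err)) / ((xs[-1] - err) - (xs[0] + err))
# Shallowest line: the opposite corners of the same error bars.
min_slope = ((ys[-1] - err) - (ys[0] + err)) / ((xs[-1] + err) - (xs[0] - err))

slope_uncertainty = (max_slope - min_slope) / 2
print(slope, slope_uncertainty)  # slope = 0.895
```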



#2
Jan 22, 2014, 01:56 PM

Mentor
P: 11,255

This appears to be what you're looking for:
https://www.che.udel.edu/pdf/FittingData.pdf

Since your data points have uncertainties associated with them (more precisely, with their y-values), scroll down to the section "Weighted Least Squares Straight Line Fitting," which begins on page 8. It might help to skim through the preceding pages first.
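A sketch of what that weighted least-squares fit looks like in code, using the standard textbook formulas for a straight line with per-point y-uncertainties (the data and ±0.1 error bars are from post #1; the variable names are mine, and this assumes all uncertainty is in y):

```python
import math

xs = [1, 2, 3, 4, 5]
ys = [2, 2.75, 3.75, 4.7, 5.5]
sigmas = [0.1] * len(xs)  # one-sigma uncertainty on each y-value

# Weights w_i = 1/sigma_i^2, and the weighted sums used in the fit.
w = [1 / s**2 for s in sigmas]
S   = sum(w)
Sx  = sum(wi * x for wi, x in zip(w, xs))
Sy  = sum(wi * y for wi, y in zip(w, ys))
Sxx = sum(wi * x * x for wi, x in zip(w, xs))
Sxy = sum(wi * x * y for wi, x, y in zip(w, xs, ys))

delta = S * Sxx - Sx**2
slope     = (S * Sxy - Sx * Sy) / delta
intercept = (Sxx * Sy - Sx * Sxy) / delta

# Standard error of the slope, from propagating the sigma_i through the fit.
slope_err = math.sqrt(S / delta)

print(f"slope = {slope:.3f} +/- {slope_err:.3f}")
```

With equal uncertainties on every point, the slope comes out the same as the unweighted Excel fit (0.895); the weighting only changes the answer when the error bars differ from point to point, but either way it gives a defensible uncertainty on the slope.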

