Calculating Uncertainty in Gradient & Intercept of Line of Best Fit

  • Context: Graduate
  • Thread starter: Identity
  • Tags: Graph, Uncertainty
SUMMARY

This discussion focuses on calculating the uncertainty in the gradient and intercept of the line of best fit using two data points with known uncertainties in the y-direction. The equation derived for the line incorporates these uncertainties, leading to expressions for the gradient (m) and intercept (C) with their respective uncertainties (U(m) and U(C)). The participants confirm that for two data points, the best fit line is indeed the line connecting the two points, and the gradient is simply the slope of this line. References to least squares fitting and regression analysis are provided for further understanding.

PREREQUISITES
  • Understanding of linear regression concepts
  • Familiarity with uncertainty propagation in measurements
  • Knowledge of least squares fitting methodology
  • Basic algebra for solving equations
NEXT STEPS
  • Study the principles of "Weighted Least Squares" for handling uncertainties
  • Learn about "Regression Analysis" techniques for multiple data points
  • Explore "Uncertainty Propagation" methods in statistical analysis
  • Review the "Least Squares Fitting" algorithm in detail
USEFUL FOR

Researchers, data analysts, and statisticians involved in data fitting and uncertainty analysis will benefit from this discussion.
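For many points with differing uncertainties, the weighted least squares approach mentioned above (weights w_i = 1/σ_i²) gives closed-form expressions for the gradient, the intercept, and their uncertainties. A minimal sketch, assuming independent Gaussian y-uncertainties (the function name is illustrative):

```python
import math

def wls_line(xs, ys, sigmas):
    """Weighted least-squares fit of y = m*x + c with weights 1/sigma_i^2.

    Returns (m, c, U_m, U_c) from the standard closed-form solution.
    """
    w = [1.0 / s**2 for s in sigmas]
    S   = sum(w)
    Sx  = sum(wi * x for wi, x in zip(w, xs))
    Sy  = sum(wi * y for wi, y in zip(w, ys))
    Sxx = sum(wi * x * x for wi, x in zip(w, xs))
    Sxy = sum(wi * x * y for wi, x, y in zip(w, xs, ys))
    delta = S * Sxx - Sx**2
    m = (S * Sxy - Sx * Sy) / delta   # gradient
    c = (Sxx * Sy - Sx * Sxy) / delta # intercept
    U_m = math.sqrt(S / delta)        # uncertainty in the gradient
    U_c = math.sqrt(Sxx / delta)      # uncertainty in the intercept
    return m, c, U_m, U_c

# Points lying exactly on y = 2x + 1, each with sigma = 1
m, c, U_m, U_c = wls_line([0, 1, 2, 3], [1, 3, 5, 7], [1, 1, 1, 1])
```

When all σ_i are equal this reduces to ordinary least squares, so it covers both cases raised in the thread.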

Identity
If you have several data points, each with a small uncertainty in the y-direction, and you want to find the uncertainty in the gradient and the uncertainty in the intercept of the line of best fit, how would you go about doing that?


*I know that with many points you would need some form of regression, but could the simple two-data-point case also be explained?

Here's what I'm thinking so far for the two-data-point case; can someone please tell me if I'm right?

Equation of the line, including uncertainties:

$$y - (y_0 \pm U(y_0)) = \frac{(y_1 \pm U(y_1)) - (y_0 \pm U(y_0))}{x_1 - x_0}(x - x_0)$$

So you would eventually get two separate "uncertainty" bits, one in the gradient and the other in the constant term.

$$y = (m \pm U(m))x + (C \pm U(C))$$

Now do you just let 'y' or 'x' equal 0 and solve?


Thanks so much
 
Wouldn't the best-fit line for just two data points, regardless of the uncertainties in their y values, be the line through the two points themselves, since each point's y value is necessarily centered in its own error range? The gradient would then just be the slope of the line through the two points.
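Taking the line through the two points as the best fit, a minimal sketch of how the ± terms in the question propagate into U(m) and U(C), assuming the two y-uncertainties are independent and Gaussian (the function name is illustrative):

```python
import math

def two_point_line(x0, y0, U0, x1, y1, U1):
    """Line through (x0, y0 +/- U0) and (x1, y1 +/- U1).

    Propagates independent y-uncertainties into the gradient m
    and the intercept C.
    """
    dx = x1 - x0
    m = (y1 - y0) / dx
    U_m = math.sqrt(U0**2 + U1**2) / abs(dx)
    # C = y0 - m*x0 simplifies to (y0*x1 - y1*x0)/dx; the rewrite matters
    # because m also contains y0, so propagating through y0 and m
    # separately would double-count that uncertainty.
    C = (y0 * x1 - y1 * x0) / dx
    U_C = math.sqrt((x1 * U0)**2 + (x0 * U1)**2) / abs(dx)
    return m, U_m, C, U_C

# (1, 2 +/- 0.1) and (3, 6 +/- 0.1) lie on y = 2x, so C = 0
m, U_m, C, U_C = two_point_line(1, 2, 0.1, 3, 6, 0.1)
```

Note that setting x = 0 in the point-slope equation, as suggested in the question, only works if the ± terms are first combined in quadrature rather than carried through as literal algebra.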
 

Similar threads

  • · Replies 6 ·
Replies
6
Views
3K
  • · Replies 5 ·
Replies
5
Views
2K
  • · Replies 1 ·
Replies
1
Views
857
  • · Replies 7 ·
Replies
7
Views
25K
  • · Replies 4 ·
Replies
4
Views
2K
  • · Replies 10 ·
Replies
10
Views
7K
  • · Replies 3 ·
Replies
3
Views
2K
  • · Replies 11 ·
Replies
11
Views
5K
  • · Replies 5 ·
Replies
5
Views
2K
  • · Replies 20 ·
Replies
20
Views
2K