Finding the Uncertainty of the Slope Parameter of a Linear Regression


Discussion Overview

The discussion revolves around the challenge of calculating the uncertainty of the slope parameter in linear regression when both x and y measurements have associated uncertainties. Participants explore different approaches to this problem, including the implications of measurement errors and the use of various regression techniques.

Discussion Character

  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • One participant seeks a straightforward method to compute the uncertainty of the slope parameter b in the context of linear regression with measurement uncertainties.
  • Another participant questions the use of "uncertainty" and points out the difficulty in estimating the standard deviation of b from a single sample.
  • A participant suggests using a linearized asymptotic estimate and discusses the relationship between the standard deviation of b and the standard deviations of the x and y measurements.
  • One participant provides a formula for b based on N observation pairs and proposes applying the propagation of error formula to estimate the uncertainty.
  • Another participant introduces the concept of "total least squares" regression, noting that it differs from ordinary least squares regression, especially when both x and y have measurement errors.
  • There is a discussion about whether the ordinary least squares estimator for slope remains unbiased in the presence of errors in the x measurements.

Areas of Agreement / Disagreement

Participants express differing views on the appropriate methods for calculating uncertainty in the slope parameter, with no consensus reached on a single approach. The discussion highlights the complexity of the problem and the various factors that influence the estimation of uncertainty.

Contextual Notes

Participants note the limitations of ordinary least squares regression in the presence of measurement errors in x, and the need for additional terms in the uncertainty calculations. The discussion reflects the dependence on specific definitions and assumptions regarding measurement uncertainties.

richardc

Suppose I have measurements x_i \pm \sigma_{xi} and y_i \pm \sigma_{yi} where \sigma is the uncertainty in the measurement. If I use a linear regression to estimate the value of b in y=a+bx, I'm struggling to find a straightforward way to compute the uncertainty of b that arises from the measurement uncertainties. This seems like it should be a very common problem, so I'm not sure why I can't find a simple algorithm or formula.

Thank you for any advice.
 
Are you using "uncertainty" to mean "standard deviation"?

It's a common problem, but it's not simple. After all, your data gives only one value for b, so how can you estimate the standard deviation of b from a sample of size 1?

The common way to get an answer is to oversimplify matters and compute a "linearized asymptotic" estimate. The value of b is some function F of the (x_i, y_i). Let L be the linear approximation of F, and assume that, near the observed sample values, L well approximates the random variable b as a linear combination of the x_i and y_i. Once a random variable is expressed as a linear combination of other random variables, its standard deviation can be expressed in terms of the standard deviations of those variables.

That's the general picture. If it's what you want to do then we can try to look up the specifics. I don't know them from memory.
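As a rough numerical sketch of that linearized ("delta-method") picture: treat the slope as a black-box function F of the sample, differentiate it by central finite differences at the observed values, and combine the per-point sigmas in quadrature. The function names and the step size h here are illustrative choices, not anything from the thread, and independent measurement errors are assumed.

```python
import numpy as np

def slope(x, y):
    """OLS slope, treated here as a black-box function F of the sample."""
    N = len(x)
    return (N * np.sum(x * y) - np.sum(x) * np.sum(y)) / \
           (N * np.sum(x ** 2) - np.sum(x) ** 2)

def linearized_sigma_b(x, y, sx, sy, h=1e-6):
    """Delta-method estimate of sigma_b: differentiate F numerically
    at the observed sample, then add the per-point contributions
    (dF/dx_i * sx_i)^2 + (dF/dy_i * sy_i)^2 in quadrature."""
    var = 0.0
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        dF_dxi = (slope(x + e, y) - slope(x - e, y)) / (2 * h)
        dF_dyi = (slope(x, y + e) - slope(x, y - e)) / (2 * h)
        var += (dF_dxi * sx[i]) ** 2 + (dF_dyi * sy[i]) ** 2
    return np.sqrt(var)
```

This avoids working out the partial derivatives by hand, at the cost of a 4N evaluations of the slope function.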
 
Thank you for clarifying the problem.

With N observation pairs I believe I can write b=\frac{N \sum x_i y_i - \sum x_i \sum y_i}{N \sum x_i^2 - (\sum x_i)^2}.

I suppose the propagation of error formula \sigma_f^2=\sum (\frac{\partial f}{\partial x_i} \sigma_{x_i} )^2 is then applied to a linear approximation of b?
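Differentiating that closed-form expression for b directly, the partials simplify in terms of the centered sums: with Sxx = \sum_k (x_k - \bar{x})^2, one gets \partial b / \partial y_i = (x_i - \bar{x})/S_{xx} and \partial b / \partial x_i = (y_i - \bar{y} - 2b(x_i - \bar{x}))/S_{xx}. A minimal sketch of the propagation step (the function name is mine; independent errors are assumed):

```python
import numpy as np

def slope_with_uncertainty(x, y, sx, sy):
    """OLS slope b and its propagated uncertainty, using the exact
    partials of the closed-form slope:
        db/dy_i = (x_i - xbar) / Sxx
        db/dx_i = (y_i - ybar - 2*b*(x_i - xbar)) / Sxx
    where Sxx = sum((x_i - xbar)^2). Independent errors assumed."""
    xbar, ybar = x.mean(), y.mean()
    Sxx = np.sum((x - xbar) ** 2)
    b = np.sum((x - xbar) * (y - ybar)) / Sxx  # algebraically equal to the formula above
    db_dy = (x - xbar) / Sxx
    db_dx = (y - ybar - 2 * b * (x - xbar)) / Sxx
    sigma_b = np.sqrt(np.sum((db_dx * sx) ** 2 + (db_dy * sy) ** 2))
    return b, sigma_b
```

Note that if all sx are zero and all sy equal a common sigma, this reduces to the familiar sigma_b = sigma / sqrt(Sxx).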
 
You state a problem where there is an error in the measurement of the x_i as well as of the y_i. In such a problem, people often use "total least squares" regression. I think the computation of the slope in total least squares regression is different from that in ordinary least squares regression, which assumes no error in the measurement of the x_i. I think the formula you gave for b is the one for ordinary least squares regression.

Of course, one may ask the question: if I fit a straight line to data using the estimator for slope used in ordinary least squares regression, and my data also has errors in the x_i, then what is the standard deviation of this estimator? If that's the question, you need terms involving \left(\frac{\partial f}{\partial y_i}\right)^2 \sigma^2_{y_i} and \left(\frac{\partial f}{\partial x_i}\right)^2 \sigma^2_{x_i}.

I don't know if the estimator for slope in ordinary least squares regression is an unbiased estimator if there are errors in the x_i.
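For what it's worth, here is a minimal sketch of the total-least-squares slope mentioned above, under the assumption of equal error variances in x and y. (For unequal but known variances one would rescale the data first, i.e. Deming regression; the function name is illustrative.)

```python
import numpy as np

def tls_slope(x, y):
    """Total (orthogonal) least-squares slope: the fitted line runs
    along the first right singular vector of the centered data.
    Assumes equal error variances in x and y."""
    A = np.column_stack([x - x.mean(), y - y.mean()])
    _, _, Vt = np.linalg.svd(A, full_matrices=False)
    vx, vy = Vt[0]       # direction of maximal variance
    return vy / vx       # slope of that direction
```

Unlike the ordinary least squares formula, this minimizes perpendicular distances to the line, so it treats x and y symmetrically: swapping the roles of x and y inverts the slope.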
 
