Finding the Uncertainty of the Slope Parameter of a Linear Regression
Suppose I have measurements x_i \pm \sigma_{xi} and y_i \pm \sigma_{yi}, where the \sigma's are the measurement uncertainties. If I use linear regression to estimate b in y = a + bx, I'm struggling to find a straightforward way to compute the uncertainty in b that arises from those measurement uncertainties. This seems like it should be a very common problem, so I'm not sure why I can't find a simple algorithm or formula for it.
Thank you for any advice.
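For what it's worth, here is a minimal sketch of one common approach when both x and y carry uncertainties: an orthogonal-distance (errors-in-variables) fit via scipy.odr, which takes the \sigma_{xi} and \sigma_{yi} directly and reports standard errors on the fitted parameters. The data arrays below are made-up placeholders, not values from the question.

```python
# Sketch: errors-in-variables straight-line fit with scipy.odr.
# Assumes the model y = a + b*x; the arrays below are placeholder data.
import numpy as np
from scipy import odr

# Placeholder measurements x_i +/- sigma_xi and y_i +/- sigma_yi
x  = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
sx = np.array([0.1, 0.1, 0.2, 0.1, 0.2])
y  = np.array([2.1, 3.9, 6.2, 7.8, 10.1])
sy = np.array([0.2, 0.3, 0.2, 0.3, 0.2])

def linear(beta, x):
    """Straight-line model y = a + b*x, with beta = [a, b]."""
    a, b = beta
    return a + b * x

data  = odr.RealData(x, y, sx=sx, sy=sy)       # sigmas become weights for the fit
model = odr.Model(linear)
fit   = odr.ODR(data, model, beta0=[0.0, 1.0]) # beta0 is just an initial guess
out   = fit.run()

a, b = out.beta                  # fitted intercept and slope
sigma_a, sigma_b = out.sd_beta   # standard errors propagated from sigma_x, sigma_y
print(f"b = {b:.3f} +/- {sigma_b:.3f}")
```

If only the y-uncertainties matter, the same question has a closed-form answer from weighted least squares: with weights w_i = 1/\sigma_{yi}^2 and S = \sum w_i, S_x = \sum w_i x_i, S_{xx} = \sum w_i x_i^2, the slope uncertainty is \sigma_b^2 = S / (S S_{xx} - S_x^2).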