Sorry if I'm in the wrong subforum.

This is a rather simple and straightforward question, I hope.

I'm doing a measurement that requires me to do a linear regression on data points to get a value of the slope. The slope is the value of the actual property that I am measuring.

Assuming no uncertainty in the data points being fit, can I simply use the standard error of the slope (as reported by the fitting software) as the uncertainty in that measurement? Is this standard practice?

I ask because the standard error of the slope is quite small and results in an uncertainty that, to me, seems unreasonably small.
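For reference, here is a minimal sketch of what fitting software typically reports: for an ordinary least-squares fit of y = a + b·x, the standard error of the slope follows from the residual variance divided by the spread in x. The data below are synthetic (a hypothetical slope of 2.5 with added noise), just to show the computation.

```python
import numpy as np

# Synthetic data: true slope 2.5, intercept 1.0, Gaussian noise (hypothetical)
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 20)
y = 2.5 * x + 1.0 + rng.normal(0.0, 0.5, size=x.size)

n = x.size
x_mean = x.mean()
Sxx = np.sum((x - x_mean) ** 2)  # spread of x about its mean

# Ordinary least-squares slope and intercept
slope = np.sum((x - x_mean) * (y - y.mean())) / Sxx
intercept = y.mean() - slope * x_mean

# Residual variance, with n - 2 degrees of freedom (two fitted parameters)
residuals = y - (intercept + slope * x)
s2 = np.sum(residuals ** 2) / (n - 2)

# Standard error of the slope: sqrt(residual variance / Sxx)
slope_se = np.sqrt(s2 / Sxx)
print(slope, slope_se)
```

Note that this standard error captures only the scatter of the points about the line; it says nothing about systematic errors or uncertainty in the x and y values themselves, which is why it can come out surprisingly small.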

**Physics Forums - The Fusion of Science and Community**


# Estimating measurement error using error from linear regression


