Uncertainty on best fit gradient

SUMMARY

The discussion centers on the limitations of software that performs least squares fitting without considering the error bars associated with data points. Users express concern that the standard error provided by the software does not account for measurement inaccuracies, potentially leading to misleading gradient uncertainties. The conversation highlights the need for statistical methods that incorporate error bars for more accurate gradient estimations. Resources such as Sze Meng Tan's lecture notes and Mathematica examples are suggested for further exploration of this topic.

PREREQUISITES
  • Understanding of least squares fitting methodology
  • Familiarity with error analysis in data measurement
  • Basic knowledge of statistical distributions, particularly normal distribution
  • Experience with Python for implementing statistical software solutions
NEXT STEPS
  • Research methods for incorporating error bars into gradient calculations
  • Learn about robust regression techniques that account for measurement errors
  • Explore the use of Python libraries such as NumPy and SciPy for statistical analysis
  • Study Sze Meng Tan's Inverse Problems lecture notes for advanced regression techniques
USEFUL FOR

Data analysts, statisticians, researchers in scientific fields, and anyone involved in regression analysis who seeks to improve the accuracy of their gradient estimations by considering measurement uncertainties.

kop442000
Hi everyone,

I have a plot of some data points that have error bars on the y axis.

A piece of software I am using gives me the best-fit gradient and a "Standard Error", but it doesn't take the size of the error bars into consideration. I'm assuming it just looks at how well the gradient fits the data points to give the standard error on the gradient.

But doesn't one need to take the size of the error bars on the data points into consideration?

Thank you.
 
It does help to know how your software arrives at its conclusions.

Perhaps the program just uses the scatter of the data points themselves to estimate the uncertainty. You can imagine that if you had inaccurate measurements which just happened to fall close to a line, the program would report a very small error. You can test this by giving it made-up data.

http://www.learningmeasure.com/articles/mathematics/LeastSquaresLineFits.shtml
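The made-up-data test suggested above can be sketched in Python (a minimal sketch assuming NumPy; the numbers are invented for illustration). The points lie almost exactly on a line despite nominally huge error bars, yet a plain unweighted fit reports a tiny uncertainty:

```python
import numpy as np

# Made-up data: points lying almost exactly on y = 2x + 1,
# even though each point nominally carries a large error bar (sigma = 5).
rng = np.random.default_rng(0)
x = np.arange(10, dtype=float)
y = 2.0 * x + 1.0 + rng.normal(0, 0.01, size=x.size)  # tiny actual scatter
yerr = np.full_like(y, 5.0)                            # large quoted error bars

# Plain (unweighted) least squares: np.polyfit ignores yerr entirely,
# so the covariance is derived from the residual scatter alone.
coeffs, cov = np.polyfit(x, y, 1, cov=True)
slope, slope_err = coeffs[0], np.sqrt(cov[0, 0])
print(f"slope = {slope:.4f} +/- {slope_err:.4f}")
```

The reported slope error here is far smaller than the 5-unit error bars would justify, which is exactly the failure mode described above.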

Example including error-bars, using mathematica:
http://mathematica.stackexchange.co...egression-given-data-with-associated-uncertai
 
Thanks Simon.

Yes, it seems that the software (within Python) just does a least squares fit, compares the data points to the best-fit gradient, and lets you know how well they fit. I wanted something that took into account the error bars on the data too - for the reason that you mentioned above.
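Something along those lines can be done directly in Python. This is a hedged sketch (not the actual software from the thread, and the data are made up): SciPy's `curve_fit` accepts per-point uncertainties via `sigma`, and `absolute_sigma=True` makes the returned covariance reflect those error bars directly:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical data with per-point y uncertainties (made-up numbers).
x = np.array([0., 1., 2., 3., 4., 5.])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1, 10.9])
yerr = np.array([0.5, 0.4, 0.6, 0.5, 0.7, 0.5])

def line(x, m, c):
    return m * x + c

# Weighted least squares: sigma supplies the error bars, and
# absolute_sigma=True treats them as absolute measurement errors
# rather than relative weights, so pcov reflects them directly.
popt, pcov = curve_fit(line, x, y, sigma=yerr, absolute_sigma=True)
m, m_err = popt[0], np.sqrt(pcov[0, 0])
print(f"gradient = {m:.3f} +/- {m_err:.3f}")
```

With `absolute_sigma=False` (the default), the covariance would be rescaled by the residual scatter, reproducing the behaviour complained about above.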

Statistically speaking, a gradient uncertainty that took into account the error bars on the data points would be better right?

I'll check out both the links - thank you!
 
Statistically speaking, a gradient uncertainty that took into account the error bars on the data points would be better right?
I'm not actually sure ... the assumption behind least squares is that the data come from a normal distribution, so the uncertainty estimate will be based on a distribution of means.
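For the specific case of a straight line with independent Gaussian errors σᵢ on the y values, the gradient uncertainty has a standard closed form (weights wᵢ = 1/σᵢ²) that depends only on the x values and the error bars, not on the scatter of y. A minimal sketch with made-up numbers:

```python
import numpy as np

# Analytic error on the gradient of a weighted straight-line fit,
# assuming independent Gaussian errors sigma_i on each y_i.
x = np.array([0., 1., 2., 3., 4., 5.])
yerr = np.array([0.5, 0.4, 0.6, 0.5, 0.7, 0.5])

w = 1.0 / yerr**2                      # weights w_i = 1 / sigma_i^2
S, Sx, Sxx = w.sum(), (w * x).sum(), (w * x**2).sum()
Delta = S * Sxx - Sx**2
slope_var = S / Delta                  # var(m) = S / (S*Sxx - Sx^2)
print(f"sigma_m = {np.sqrt(slope_var):.3f}")
```

Comparing this analytic value against what a fitting routine reports is a quick check of whether that routine is actually using the error bars.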

Sze Meng Tan's Inverse Problems lecture notes provide an in-depth treatment of regression in one of the chapters... the notes are available here:
http://home.comcast.net/~szemengtan/
 
Thank you for your help!
 
