Uncertainty on best fit gradient

In summary, the conversation discusses error bars on data points and whether fitting software accounts for them in its calculations. The question is whether the software determines the standard error on the gradient purely from how well the points fit the line, or whether it also uses the size of the error bars. The discussion suggests testing the software with made-up data and points to additional resources on regression and uncertainty estimates.
  • #1
kop442000
Hi everyone,

I have a plot of some data points that have error bars on the y axis.

A bit of software I am using gives me the best fit gradient and a "Standard Error", but it doesn't take the size of error bars into consideration. I'm assuming that it just looks at how well the gradient fits the data points to give the standard error on the gradient.

But doesn't one need to take into consideration the size of the error bars on the data points??

Thank you.
 
  • #2
It does help to know how your software arrives at its conclusions.

Perhaps the program just uses the statistics of the data points to estimate the uncertainty. You can imagine that if you had inaccurate measurements which just happened to fall close to a line, the program would report a very small error. You can test this by giving it made-up data.
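Here's a rough sketch of that kind of test in Python (assuming numpy and scipy are available; your own software may be something else entirely): the points fall almost exactly on a line, yet we pretend each one carries a large error bar, and an ordinary unweighted fit still reports a tiny standard error on the slope.

```python
import numpy as np
from scipy import stats

# Made-up data: points that lie almost exactly on a line of slope 2,
# but we pretend each point carries a large measurement uncertainty.
rng = np.random.default_rng(0)
x = np.arange(10.0)
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.01, size=x.size)   # tiny real scatter
y_err = np.full_like(y, 5.0)                              # large (ignored) error bars

# An ordinary unweighted fit only sees the scatter of the points,
# so it reports a tiny standard error on the slope no matter what y_err says.
fit = stats.linregress(x, y)
print(f"slope = {fit.slope:.4f} +/- {fit.stderr:.4f}  (claimed y_err = {y_err[0]})")
```

If the reported standard error does not change when you inflate y_err, the routine is ignoring the error bars.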

http://www.learningmeasure.com/articles/mathematics/LeastSquaresLineFits.shtml

Example including error-bars, using mathematica:
http://mathematica.stackexchange.co...egression-given-data-with-associated-uncertai
 
  • #3
Thanks Simon.

Yes, it seems that the software (within Python) just does a least-squares fit, compares the data points to the best-fit gradient, and lets you know how well they fit. I wanted something that takes the error bars on the data into account too, for the reason you mentioned above.
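For reference, here's a minimal sketch of what I mean, assuming SciPy's curve_fit and some made-up x, y, y_err arrays (not what my original routine does):

```python
import numpy as np
from scipy.optimize import curve_fit

def line(x, m, c):
    return m * x + c

# Placeholder data with per-point uncertainties on y.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])
y_err = np.array([0.3, 0.3, 0.4, 0.4, 0.5])

# sigma feeds the error bars in as weights; absolute_sigma=True keeps them
# as absolute uncertainties instead of rescaling them to match the residuals.
popt, pcov = curve_fit(line, x, y, sigma=y_err, absolute_sigma=True)
m_fit, c_fit = popt
m_err = np.sqrt(pcov[0, 0])   # standard error on the gradient
print(f"gradient = {m_fit:.3f} +/- {m_err:.3f}")
```

Passing sigma with absolute_sigma=True means the returned covariance reflects the stated error bars rather than just the scatter of the points.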

Statistically speaking, a gradient uncertainty that took into account the error bars on the data points would be better, right?

I'll check out both the links - thank you!
 
  • #4
Statistically speaking, a gradient uncertainty that took into account the error bars on the data points would be better, right?
I'm not actually sure ... the assumption behind least squares is that the data come from a normal distribution, so the uncertainty estimate will be based on a distribution of means.
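For reference, the standard weighted least-squares ("chi-squared") recipe does use the error bars explicitly (a textbook result, not tied to any particular package). You minimise

$$\chi^2(m,c)=\sum_i \frac{\left(y_i - m x_i - c\right)^2}{\sigma_i^2}, \qquad w_i=\frac{1}{\sigma_i^2}$$

and the resulting gradient uncertainty is

$$\sigma_m^2 = \frac{S}{S\,S_{xx}-S_x^2}, \qquad S=\sum_i w_i,\quad S_x=\sum_i w_i x_i,\quad S_{xx}=\sum_i w_i x_i^2 ,$$

so a point with a large error bar pulls on the fit, and on the quoted gradient uncertainty, much less than a precise one.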

Sze Meng Tan's Inverse Problems lecture notes provide an in-depth treatment of regression in one of the chapters... the notes are available here:
http://home.comcast.net/~szemengtan/
 
  • #5
Thank you for your help!
 

What is uncertainty on best fit gradient?

Uncertainty on best fit gradient refers to the degree of error or variability in the slope of a linear regression or best fit line. It represents the range of possible values for the slope, taking into account the limitations and uncertainties of the data and the fitting process.

Why is it important to consider uncertainty on best fit gradient?

Considering uncertainty on best fit gradient is important because it provides a measure of how reliable or accurate the slope of a best fit line is. It allows scientists to understand the potential range of values for the slope and make informed decisions about the significance of the relationship between variables.

How is uncertainty on best fit gradient calculated?

Uncertainty on best fit gradient is typically calculated using statistical methods such as standard error or confidence intervals. These methods take into account the variability and errors in the data points, and provide a range of values for the slope with a given level of confidence.
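As a concrete illustration, here is a minimal sketch in Python using numpy.polyfit (the data values are placeholders):

```python
import numpy as np

# Placeholder data.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

# Fit a straight line and ask for the covariance matrix of the coefficients.
coeffs, cov = np.polyfit(x, y, deg=1, cov=True)
slope, intercept = coeffs
slope_err = np.sqrt(cov[0, 0])          # standard error on the gradient

# Rough 95% confidence interval (normal approximation).
print(f"gradient = {slope:.3f} +/- {1.96 * slope_err:.3f}")
```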

Can uncertainty on best fit gradient be reduced?

Uncertainty on best fit gradient cannot be completely eliminated, but it can be reduced by increasing the sample size, improving the accuracy and precision of data measurements, and using more advanced statistical techniques for fitting the data.

How does uncertainty on best fit gradient affect the interpretation of results?

Uncertainty on best fit gradient should always be taken into consideration when interpreting the results of a linear regression or best fit line. A larger uncertainty indicates a less precise relationship between the variables, and the significance of the results should be evaluated accordingly.
