Statistics proof: y = k x holds for a data set

  • #1
avicenna
Simple linear regression statistics:

Suppose I have a linear relation (or wish to establish such a relation) ##y = kx##, where ##k## is a constant, and a set of ##n## experimental data points ##(x_0, y_0), (x_1, y_1), \dots##, each measured with some error estimate.

Is there a way to quantify how well the ##n## data points support the relation ##y = kx##? What I have in mind is that the regression line will give a small intercept on the y-axis, say ##e = 1.0 \times 10^{-5}##. What is the "confidence" for this intercept estimate?

I want to show that the error ##e## is very small, say ##< 1.0 \times 10^{-7}##. If the measurement errors of the ##(x_i, y_i)## are very small, how does that help show that ##y = kx## is "very good", in the sense that ##y = k(1 + e)x## with ##e## very small?
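To make the question concrete, here is a minimal Python sketch (NumPy only, with made-up example data standing in for the real measurements) of the unconstrained fit ##y = a + kx##: it estimates the intercept ##a## and its standard error, from which an approximate confidence interval for the intercept follows.

```python
import numpy as np

# Hypothetical data: replace x, y with the real measurements.
rng = np.random.default_rng(0)
x = np.linspace(1.0, 10.0, 20)
k_true = 2.5
y = k_true * x + rng.normal(scale=0.01, size=x.size)  # small measurement noise

n = x.size

# Unconstrained fit y = a + k*x by ordinary least squares.
A = np.column_stack([np.ones(n), x])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
a_hat, k_hat = coef

# Residual variance and covariance matrix of the fitted (a, k).
resid = y - A @ coef
s2 = resid @ resid / (n - 2)          # unbiased residual variance
cov = s2 * np.linalg.inv(A.T @ A)     # covariance of (a, k)
se_a = np.sqrt(cov[0, 0])             # standard error of the intercept

# Approximate 95% confidence interval for the intercept (t-factor ~ 2 for moderate n).
print(f"intercept a = {a_hat:.2e} +/- {2 * se_a:.2e} (approx. 95% CI)")
print(f"slope     k = {k_hat:.4f}")
```

If the resulting interval for ##a## comfortably contains zero and is as narrow as the measurement errors allow, that is the usual way of stating that the data are consistent with a line through the origin.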
 
  • #2
BvU
Google is your friend.

I learned about ##\chi^2## as a measure of goodness of fit. But that was long ago...

[edit] By the way, a visual inspection of the results (with error bars) is also a very good idea. Make sure systematic errors are omitted when drawing the error bars.
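As a rough illustration of the ##\chi^2## approach, here is a sketch (assuming the per-point measurement uncertainties ##\sigma_i## are known; the numbers below are made up): it fits the no-intercept model ##y = kx## by weighted least squares and reports ##\chi^2## per degree of freedom together with the corresponding p-value.

```python
import numpy as np
from scipy import stats

# Hypothetical data with known per-point measurement errors sigma.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.49, 5.02, 7.48, 10.01, 12.51])
sigma = np.full_like(y, 0.02)

# Weighted least-squares slope for the model y = k*x (no intercept).
w = 1.0 / sigma**2
k_hat = np.sum(w * x * y) / np.sum(w * x * x)

# Chi-squared of the fit and its p-value with n - 1 degrees of freedom.
chi2 = np.sum(((y - k_hat * x) / sigma) ** 2)
dof = len(x) - 1
p_value = stats.chi2.sf(chi2, dof)

print(f"k = {k_hat:.4f}, chi2/dof = {chi2 / dof:.2f}, p = {p_value:.3f}")
```

A reduced ##\chi^2## near 1 (equivalently, a p-value that is not extreme) says the data are consistent with ##y = kx## within the quoted measurement errors; a much larger value says the model, or the error estimates, should be questioned.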
 
  • #3
BvU said:
Google is your friend.

I learned about ##\chi^2## as a measure of goodness of fit. But that was long ago...

[edit] By the way, a visual inspection of the results (with error bars) is also a very good idea. Make sure systematic errors are omitted when drawing the error bars.
Thanks. I think I now have some idea of what I really wanted. It is not as simple and straightforward as I thought.
 