I have a measurement dataset of [tex] (x_i,y_i) [/tex] pairs, with an error [tex] \Delta x_i [/tex] and [tex] \Delta y_i [/tex] for each value, so that I can plot the data points with both vertical and horizontal error bars. I want to fit a linear regression line [tex]y=a_1 x + a_0[/tex] to the data, together with error lines.

But how do I take into account, when fitting the regression line, that each data point has its own error in the x- and y-directions?

And what about the error lines, so that I get the minimum and maximum values of [tex]a_1, a_0[/tex]? I could use the standard deviation, but that again does not take the errors [tex] \Delta x_i [/tex] and [tex] \Delta y_i [/tex] into account.

This picture illustrates my problem:

http://www.chemistry.adelaide.edu.au/external/soc-rel/content/images/graph_er.png

I'm interested only in mathematical ways of doing this; I already know how to do it by hand. A Matlab example in particular would be greatly appreciated.
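For reference, one standard approach to this problem is weighted orthogonal distance regression (an errors-in-variables fit), which minimizes residuals weighted by both [tex] \Delta x_i [/tex] and [tex] \Delta y_i [/tex]. Below is a minimal sketch using SciPy's `scipy.odr` module (a Python stand-in for the requested Matlab, since it wraps the well-known ODRPACK routine); the data values are made up purely for illustration:

```python
import numpy as np
from scipy import odr

# Hypothetical measurements with per-point uncertainties (illustrative values only)
x  = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y  = np.array([2.1, 3.9, 6.2, 7.8, 10.1])
dx = np.array([0.1, 0.1, 0.2, 0.1, 0.2])   # Delta x_i
dy = np.array([0.2, 0.3, 0.2, 0.3, 0.2])   # Delta y_i

def linear(B, x):
    # B[0] = slope a1, B[1] = intercept a0
    return B[0] * x + B[1]

model = odr.Model(linear)
data = odr.RealData(x, y, sx=dx, sy=dy)     # sx, sy weight each point by its errors
fit = odr.ODR(data, model, beta0=[1.0, 0.0]).run()

a1, a0 = fit.beta        # best-fit slope and intercept
da1, da0 = fit.sd_beta   # standard errors of the fitted parameters
print(f"y = ({a1:.3f} +/- {da1:.3f}) x + ({a0:.3f} +/- {da0:.3f})")
```

The parameter uncertainties `sd_beta` give one way to draw the min/max error lines: use slopes [tex]a_1 \pm \Delta a_1[/tex] and intercepts [tex]a_0 \mp \Delta a_0[/tex] for the two extreme lines.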

