deccard
I have a measurement dataset of (x_i, y_i) pairs, each with uncertainties \Delta x_i and \Delta y_i, so I can plot the data points with both vertical and horizontal error bars. I want to fit a linear regression line y = a_1 x + a_0 to the data, together with error lines.
But how do I take into account, while fitting the regression line, that each data point has an error in both the x- and y-directions?
And how do I obtain the error lines, i.e. the minimum and maximum values of a_1 and a_0? I could use the standard deviation, but that does not take the errors \Delta x_i and \Delta y_i into account.
This picture illustrates my problem:
http://www.chemistry.adelaide.edu.au/external/soc-rel/content/images/graph_er.png
I'm only interested in mathematical ways to do this; I already know how to do it by hand. A Matlab example in particular would be greatly appreciated.
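One standard approach to this problem is an errors-in-variables fit by the "effective variance" method (a simplified form of York's algorithm): each point is weighted by w_i = 1/(\Delta y_i^2 + a_1^2 \Delta x_i^2), which folds the x-error into an equivalent y-error via the slope, and the weighted fit is iterated because the weights themselves depend on a_1. Below is a minimal sketch in Python (easily translated to Matlab); the function name and the sample data are mine, purely for illustration, not from the original post.

```python
# Sketch: straight-line fit y = a1*x + a0 with uncertainties in both x and y,
# using the iterative "effective variance" weighting
# w_i = 1 / (dy_i^2 + a1^2 * dx_i^2).

def fit_effective_variance(x, y, dx, dy, iters=50):
    a1 = 0.0  # initial slope guess; first pass reduces to a 1/dy^2-weighted fit
    for _ in range(iters):
        # Effective weights: x-error converted to y-error through the slope
        w = [1.0 / (dyi**2 + a1**2 * dxi**2) for dxi, dyi in zip(dx, dy)]
        sw = sum(w)
        xbar = sum(wi * xi for wi, xi in zip(w, x)) / sw
        ybar = sum(wi * yi for wi, yi in zip(w, y)) / sw
        num = sum(wi * (xi - xbar) * (yi - ybar)
                  for wi, xi, yi in zip(w, x, y))
        den = sum(wi * (xi - xbar)**2 for wi, xi in zip(w, x))
        a1 = num / den  # weighted least-squares slope with current weights
    a0 = ybar - a1 * xbar
    # Approximate standard errors of the parameters (from the weighted
    # normal equations); a1 +/- da1 and a0 +/- da0 give the error lines.
    da1 = (1.0 / den) ** 0.5
    da0 = (1.0 / sw + xbar**2 / den) ** 0.5
    return a0, a1, da0, da1

# Illustrative data (made up, roughly y = 2x):
x  = [1.0, 2.0, 3.0, 4.0, 5.0]
y  = [2.1, 3.9, 6.2, 7.8, 10.1]
dx = [0.1, 0.1, 0.2, 0.1, 0.2]
dy = [0.2, 0.3, 0.2, 0.3, 0.2]
a0, a1, da0, da1 = fit_effective_variance(x, y, dx, dy)
print("slope:", a1, "+/-", da1)
print("intercept:", a0, "+/-", da0)
```

The min/max lines you describe then correspond to slopes a_1 ± \Delta a_1 and intercepts a_0 ∓ \Delta a_0. In Matlab the same loop translates directly to vectorized operations on arrays (`w = 1./(dy.^2 + a1^2*dx.^2)` and so on).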