fhqwgads2005
Hi y'all, wondering if you could help me with this. I have a data set with a linear relationship between the independent and dependent variables. Both the dependent and independent variables have measurement error, and this error is not constant from point to point.
For example,
{x1, x2, x3, x4, x5}
{y1, y2, y3, y4, y5}
{dx1, dx2, dx3, dx4, dx5}
{dy1, dy2, dy3, dy4, dy5}
where one data point would be (x1 ± dx1, y1 ± dy1), and so on.
Assuming the relationship is of the form y = ax + b, I need both the best value for a and its uncertainty, (a ± da).
I've been scouring the internet for more information on total least squares methods, and generalized method of moments, etc. but I can't find something that works for the case where the error in x and y is just some arbitrary value, like in my case.
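In case it helps frame the question: one technique that accepts arbitrary per-point errors in both x and y is weighted orthogonal distance regression, which SciPy exposes as `scipy.odr`. Below is a minimal sketch, where the data arrays are made-up placeholders standing in for the {x}, {y}, {dx}, {dy} lists above:

```python
import numpy as np
from scipy import odr

def linear(beta, x):
    # Model y = a*x + b, with beta = [a, b]
    return beta[0] * x + beta[1]

# Hypothetical example data (replace with your measurements)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([3.0, 5.0, 7.0, 9.0, 11.0])
dx = np.array([0.1, 0.2, 0.1, 0.3, 0.2])   # per-point error in x
dy = np.array([0.2, 0.1, 0.3, 0.2, 0.1])   # per-point error in y

# RealData takes the standard deviations sx, sy directly as weights
data = odr.RealData(x, y, sx=dx, sy=dy)
model = odr.Model(linear)
result = odr.ODR(data, model, beta0=[1.0, 0.0]).run()

a, b = result.beta          # best-fit slope and intercept
da, db = result.sd_beta     # their standard errors
print(f"a = {a} ± {da}, b = {b} ± {db}")
```

The fitter minimizes the weighted orthogonal distances from the points to the line, so both dx and dy influence the fit and the reported uncertainties, unlike ordinary least squares, which assumes x is error-free.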
helpful hints?