I have 3 equations with two unknowns of the form:
r_i = f_i(g, \beta)
where g and \beta are the unknown independent variables, and the r_i are known experimentally but carry some error, say something like 4.3 \pm 0.3; there are also errors associated with constants inside the f_i.
My question is: how do I solve for the objectively "best" fit of those two parameters? I remember doing something like this back in prob/stats; I believe it's a \chi^2 fit, but everything I can find online deals with binned data points rather than continuous functions. (Or do I have to simulate data and then fit to it?)
Can anyone lead me in the right direction? I don't have any of my undergrad books with me.
The equations are all something like:
r_1 = \frac{\left(g^2 x + g^2 y - \frac{\beta}{3}\right)^2}{g^2}
with some variations on that form, so nothing involving functions that can't be handled in closed form.
EDIT: Oh, and I plan on using Mathematica for this, but I'd rather understand the mathematics first.
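EDIT 2: For concreteness, here is a rough sketch of the kind of \chi^2 minimization I have in mind, written in Python/SciPy just to pin the idea down (the real work will be in Mathematica). Only f1 mirrors the form above; f2, f3, the constants x and y, the measured values, their uncertainties, and the starting guess are all invented placeholders.

```python
# Sketch of a chi^2 (weighted least squares) fit of two parameters (g, beta)
# to three measured values r_i with uncertainties sigma_i.
import numpy as np
from scipy.optimize import least_squares

# Hypothetical constants appearing inside the f_i (stand-ins only).
x, y = 1.2, 0.7

# Model functions r_i = f_i(g, beta). Only f1 mirrors the form in the post;
# f2 and f3 are invented placeholders.
def f1(g, beta):
    return (g**2 * x + g**2 * y - beta / 3) ** 2 / g**2

def f2(g, beta):
    return g * beta + x          # placeholder

def f3(g, beta):
    return g**2 - beta * y       # placeholder

# Measured values and their 1-sigma uncertainties (made up for illustration).
r = np.array([4.3, 2.1, 0.9])
sigma = np.array([0.3, 0.2, 0.1])

def residuals(params):
    g, beta = params
    model = np.array([f1(g, beta), f2(g, beta), f3(g, beta)])
    # Each residual is divided by its uncertainty, so the sum of squares
    # of this vector is exactly chi^2.
    return (model - r) / sigma

# Minimize chi^2 over (g, beta), starting from an arbitrary initial guess.
fit = least_squares(residuals, x0=[1.0, 1.0])
g_best, beta_best = fit.x
chi2 = np.sum(fit.fun**2)
print(g_best, beta_best, chi2)
```

Is minimizing that weighted sum of squared residuals the right statistical notion of "best" here, and how would I then get uncertainties on g and \beta out of it?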