Hello! I have 5 data points with errors associated with them, ##y_i \pm dy_i##, and the corresponding ##x_i## values (which have no uncertainties). I need to calculate the difference between the first of these points, ##y_1##, and the rest, and fit a straight line to the result (so the plot will be ##\Delta y## vs ##x##). For the other 4 points the propagated error is ##d(\Delta y_i)=\sqrt{dy_1^2+dy_i^2}## for ##i## from 2 to 5. As for the first point itself, at ##x_1## the value of ##\Delta y## should be zero, and since it is the reference point, the error associated with it should be zero too (right?).

Now, when I make a least-squares fit, I need to weight the difference between the model and the data by ##1/d(\Delta y)##. That weight is infinite for the first point, which I guess makes sense, as I am certain the line should pass through that point. However, I am not sure how to make this work numerically, i.e. using a fitting program. Should I just replace 0 by something like ##10^{-15}##? Does anyone have any advice on how to handle this infinity? Thank you!
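For concreteness, here is a small sketch of the setup described above. The data values are made up, and the way the infinity is sidestepped here — building the zero constraint into the model by forcing the line through ##(x_1, 0)## and fitting only the remaining four points — is just one possible approach, not necessarily the right one:

```python
import numpy as np
from scipy.optimize import curve_fit

# Made-up example data: 5 points y_i +/- dy_i at abscissas x_i
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([10.0, 11.9, 14.1, 16.0, 18.2])
dy = np.array([0.1, 0.2, 0.15, 0.2, 0.1])

# Differences relative to the first (reference) point, i = 2..5,
# with the propagated errors d(Delta y_i) = sqrt(dy_1^2 + dy_i^2)
delta = y[1:] - y[0]
d_delta = np.sqrt(dy[0]**2 + dy[1:]**2)

# Instead of giving the reference point an infinite weight, encode
# the exact constraint Delta y(x_1) = 0 in the model itself: a line
# through (x_1, 0) has only a slope parameter left to fit.
def line_through_ref(xv, m):
    return m * (xv - x[0])

popt, pcov = curve_fit(line_through_ref, x[1:], delta,
                       sigma=d_delta, absolute_sigma=True)
m, dm = popt[0], np.sqrt(pcov[0, 0])
print(f"slope = {m:.3f} +/- {dm:.3f}")
```

With the constraint absorbed into the model, every remaining weight ##1/d(\Delta y_i)## is finite, so no ##10^{-15}## placeholder is needed.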