BillKet
Hello! I need to perform a fit with several variables, and two of them are very correlated (above 0.99). The functional form involving these two variables is something like ##(p+q)x+qf(x)##, where ##f(x)## contains polynomials and some square roots of ##x##, but the coefficients appearing in ##f(x)## are much smaller than one, for example something like ##10^{-7} x^2## (for completeness, though not very relevant to my question: this comes from fitting the p and q lambda-doubling parameters of a ##^2\Pi_{1/2}## state in a diatomic molecule).

If I keep both p and q as free variables, I end up with values around p = 0.1 and q = 0.001, with the error on both on the order of 0.0001 and a very good RMS error for the points used in the fit. If instead I fix q = 0, the uncertainty on p becomes 10 times smaller, but the RMS error is about 50% larger. I also tried fixing q at its fitted value, q = 0.001, and fitting only for p. In that case the RMS was as good as initially (even slightly better), and the uncertainty on p was 10 times smaller than before.

I am not sure what is the best way to present my results. If I let both p and q vary, the uncertainty on p is large, but that does not seem to reflect the truth, since that error is mainly driven by q: the two appear as p + q. If I fix q = 0.001, the quoted errors on p and q would differ by a factor of 10, and I am not sure that makes sense mathematically, as they do appear as p + q. Can someone advise me on the best way to proceed? Thank you!
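To make the situation concrete, here is a minimal toy sketch of the effect I am describing, using scipy's `curve_fit` with synthetic data (the model, the made-up ##f(x)##, the parameter values, and the noise level are all illustrative assumptions, not my real data). Because the two basis functions ##x## and ##x+f(x)## are nearly parallel, the fitted p and q come out strongly anticorrelated, and fixing q shrinks the reported uncertainty on p:

```python
import numpy as np
from scipy.optimize import curve_fit

# Toy stand-in for f(x): a small polynomial plus a square root,
# with coefficients much smaller than one (illustrative only).
def f(x):
    return 1e-7 * x**2 + np.sqrt(x)

def model(x, p, q):
    # Same structure as in the question: (p+q)*x + q*f(x)
    return (p + q) * x + q * f(x)

# Simulated data with assumed "true" values p=0.1, q=0.001.
rng = np.random.default_rng(0)
x = np.linspace(1.0, 100.0, 50)
y = model(x, 0.1, 0.001) + rng.normal(0.0, 0.01, x.size)

# Fit 1: both p and q free.
popt2, pcov2 = curve_fit(model, x, y, p0=[0.1, 0.001])
corr = pcov2[0, 1] / np.sqrt(pcov2[0, 0] * pcov2[1, 1])
print("p, q          :", popt2)
print("corr(p, q)    :", corr)          # close to -1
print("sigma_p (free):", np.sqrt(pcov2[0, 0]))

# Fit 2: q fixed at its fitted value, only p free.
q_fixed = popt2[1]
popt1, pcov1 = curve_fit(lambda x, p: model(x, p, q_fixed),
                         x, y, p0=[0.1])
print("sigma_p (q fixed):", np.sqrt(pcov1[0, 0]))
```

Running something like this reproduces the pattern I see in my real fit: the correlation coefficient between p and q is close to -1, and the uncertainty on p with q fixed is much smaller than with q free.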