**Curve-fitting y=Ax^p+C**

Howdy,

So I have some data that I suspect to follow a

[tex]y=x^p+C[/tex]

relationship, where p and C are unknown real numbers. The y values carry some uncertainty, so I want to use a least-squares (or similar) method to fit a curve and quantify the goodness of fit. I usually have a value for y(0), so what I have been doing is using that as my value for C, and plotting

[tex]\ln(y-y(0))[/tex]

versus

[tex]\ln x[/tex]

from which I get the gradient p using standard linear regression. This works fine for the most part, but it involves the erroneous assumption that there is no uncertainty in the value for C. It also exaggerates the deviations from this model near x=0, which is problematic when the choice of C is not that great. I thought I could maybe vary C and minimise the residuals of the linearised plot and obtain a best value for C that way, but I'm thinking there's probably a more elegant way.
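For concreteness, here is a minimal sketch of the workflow described above, including the idea of scanning C to minimise the residuals of the linearised plot. The data and true parameter values below are made up purely for illustration; the real x and y arrays would come from the measurements:

```python
import numpy as np

# Synthetic stand-in for the measured data (assumed: p = 1.7, C = 2.0);
# in practice x and y would be the experimental values
rng = np.random.default_rng(1)
x = np.linspace(1.0, 10.0, 60)
y = x**1.7 + 2.0 + rng.normal(0.0, 0.02, x.size)

def fit_p(C):
    """Linearise ln(y - C) = p ln(x) and regress for the gradient p.

    Returns the fitted p and the sum of squared residuals in log space.
    """
    ly = np.log(y - C)
    lx = np.log(x)
    p, intercept = np.polyfit(lx, ly, 1)
    resid = ly - (p * lx + intercept)
    return p, np.sum(resid**2)

# Scan candidate C values and keep the one that minimises the
# residuals of the linearised fit, rather than trusting y(0) exactly
Cs = np.linspace(1.5, 2.4, 181)
fits = [fit_p(c) for c in Cs]
best = int(np.argmin([ssr for _, ssr in fits]))
p_best, C_best = fits[best][0], Cs[best]
```

Note that the log-space residuals weight points near x = 0 heavily, which is exactly the exaggeration mentioned above; minimising them over C mitigates, but does not remove, that bias.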

This seems like it must be a really common issue, but I can't find anything about it anywhere. Any ideas? Cheers :)
