
Judgement of good fit: where to stop?

  1. Feb 23, 2008 #1


    User Avatar

    I have to fit a curve to predict Y from X. Fifty (x, y) pairs are given. A freehand plot shows (and it is otherwise known) that Y and X have a one-to-one relationship. The basic behaviour of Y is quite composite, involving x^a, sin(c + bx), d^x, an additive constant, and different constant multipliers on the first three terms. All constants are real valued.
    That is,
    Y = p + q·x^a + r·sin(c + bx) + s·d^x
    What I did was start with arbitrary values of a, b, c, d, find p, q, r, s by least squares, and compute the residual sum of squares (RSS).
    Then I varied 'a' and compared the RSS until it reached a minimum, repeated the same with the other parameters one at a time, came back to 'a', and so on. This produced a nice fit, but the RSS never seems to stabilize: it keeps decreasing (though of course it does not become 0).

    My question is: when should I stop? Is there an objective method, or a cutoff value of the RSS (or of R^2), at which I can say the fit is satisfactory?
    PS: Frequency distributions cannot be formed to test goodness of fit.

    Any idea is appreciated.
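    For concreteness, the alternating search described above can be sketched as below. This is a minimal sketch only: the synthetic data stand in for the private (x, y) pairs, and the starting values, parameter search ranges, and noise level are all illustrative assumptions. The key point it shows is that for fixed (a, b, c, d) the coefficients (p, q, r, s) come from ordinary linear least squares.

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar

    # Synthetic stand-in for the 50 private (x, y) pairs described in the post.
    rng = np.random.default_rng(0)
    x = np.linspace(0.5, 5.0, 50)
    y = (2 + 3 * x**1.5 + 0.5 * np.sin(1 + 2 * x) + 0.1 * 1.3**x
         + rng.normal(0, 0.05, x.size))

    def design(a, b, c, d):
        # One column per linear coefficient p, q, r, s.
        return np.column_stack([np.ones_like(x), x**a, np.sin(c + b * x), d**x])

    def rss(theta):
        # Linear least squares for (p, q, r, s) given the nonlinear (a, b, c, d),
        # then the residual sum of squares of the combined fit.
        A = design(*theta)
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        return float(np.sum((y - A @ coef) ** 2))

    theta = np.array([1.0, 1.0, 0.0, 1.1])   # arbitrary start, as in the post
    bounds = [(0.1, 3.0), (0.1, 4.0), (-np.pi, np.pi), (0.5, 2.0)]  # assumed ranges
    rss0 = rss(theta)

    for sweep in range(20):                  # cycle a -> b -> c -> d -> a -> ...
        for i in range(4):
            f = lambda t, i=i: rss(np.concatenate([theta[:i], [t], theta[i+1:]]))
            res = minimize_scalar(f, bounds=bounds[i], method="bounded")
            if res.fun < rss(theta):         # accept only improvements
                theta[i] = res.x
    ```

    As the post observes, the RSS from such a cyclic search decreases at every accepted step but need not converge quickly, which is exactly why an external stopping criterion is needed.
    
    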
  3. Feb 24, 2008 #2
  4. Feb 25, 2008 #3
    BTW, just for fun, could you provide the (x, y) data? It would be an interesting and educational (for me, at least) exercise to play and tweak with Mathematica's "Non-linear Regression" package. I'm kind of curious to try it in OriginPro, too; I've never messed with their non-linear stuff.

  5. Feb 25, 2008 #4

    D H

    Staff: Mentor

    You have a number of plausible explanations of the variance in the data. The difficulties you are facing are (1) that these explanations are not independent of one another, and (2) the models are non-linear.

    One approach to the problem is to start with nothing (i.e., treat the data as just random numbers). One of the plausible explanations will most likely do a better job than any of the others at reducing the residual variance. Keep repeating this step -- i.e., add one explanation at a time to your overall model. Stop when the residual variance can be attributed to measurement noise with a high degree of confidence. Note well: This means you need some kind of model of the measurement process.
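    A sketch of this add-one-term-at-a-time idea, assuming a known measurement noise level sigma (the candidate terms, their fixed inner parameters, and the reduced-chi-square threshold of 1.5 are all illustrative assumptions, not part of the original post):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    x = np.linspace(0.5, 5.0, 50)
    sigma = 0.05                            # assumed known measurement noise
    y = 2 + 3 * x**1.5 + rng.normal(0, sigma, x.size)

    # Candidate explanatory terms (illustrative choices).
    candidates = {
        "const": np.ones_like(x),
        "x^1.5": x**1.5,
        "sin":   np.sin(1 + 2 * x),
        "exp":   1.3**x,
    }

    def fit_rss(cols):
        # Residual sum of squares of a linear least-squares fit on these columns.
        A = np.column_stack(cols)
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        return float(np.sum((y - A @ coef) ** 2))

    chosen, cols = [], []
    while candidates:
        # Add the candidate that most reduces the residual sum of squares.
        best = min(candidates, key=lambda k: fit_rss(cols + [candidates[k]]))
        cols.append(candidates.pop(best))
        chosen.append(best)
        dof = x.size - len(cols)
        red_chi2 = fit_rss(cols) / (dof * sigma**2)
        if red_chi2 < 1.5:                  # residuals consistent with noise: stop
            break
    ```

    The stopping rule here is the reduced chi-square dropping to order 1, which operationalizes "the residual variance can be attributed to measurement noise" and directly answers the original poster's question about an objective cutoff, provided the noise level is known.
    
    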

    Another approach is to start with a full model. Throw the kitchen sink at the problem. The first approach worked by adding terms step-by-step. This approach works by subtracting terms. Find the model term such that removing it does the least damage to the residual variance. Suppose that the residual variance can still be attributed with high confidence to measurement noise after removing this least-powerful term. That means that this term is not significant. Throw it out. Repeat throwing out terms until you finally come up against a term that is significant.
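    The subtractive approach can be sketched with an extra-sum-of-squares F-test deciding whether the weakest term is significant (again a sketch: the term set, the synthetic data, and the 95% significance level are illustrative assumptions):

    ```python
    import numpy as np
    from scipy.stats import f as f_dist

    rng = np.random.default_rng(2)
    x = np.linspace(0.5, 5.0, 50)
    y = 2 + 3 * x**1.5 + rng.normal(0, 0.05, x.size)

    # Kitchen-sink model: every plausible term goes in at the start.
    terms = {"const": np.ones_like(x), "x^1.5": x**1.5,
             "sin": np.sin(1 + 2 * x), "exp": 1.3**x}

    def rss_of(names):
        A = np.column_stack([terms[n] for n in names])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        return float(np.sum((y - A @ coef) ** 2))

    names = list(terms)
    while len(names) > 1:
        full = rss_of(names)
        # Term whose removal does the least damage to the residual variance.
        weakest = min(names, key=lambda n: rss_of([m for m in names if m != n]))
        reduced = rss_of([m for m in names if m != weakest])
        dof = x.size - len(names)
        F = (reduced - full) / (full / dof)  # extra-sum-of-squares F statistic
        if F > f_dist.ppf(0.95, 1, dof):
            break                            # weakest term is still significant
        names.remove(weakest)
    ```

    On data generated from only a constant plus a power term, this prunes the superfluous sine and exponential terms and then stops, since removing either surviving term degrades the fit far beyond what noise can explain.
    
    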
  6. Feb 26, 2008 #5


    User Avatar

    Sorry, that cannot be done: the data are not public and are kept confidential from anyone not permitted to use them.
  7. Feb 26, 2008 #6


    User Avatar

    Thanks for your opinion, but approaches like orthogonal polynomials do not work here. The basic shape has already been identified in the form I mentioned; the question is only about a satisfactory (objective) degree of accuracy.
    Last edited: Feb 26, 2008