
How do I model a set of random points extrapolated along the x-axis with an equation?

  1. May 4, 2006 #1
    I'd like to know if it's possible to create an equation to model a set of points along the x-axis, where each point's y-coordinate is an integer between 1 and, say, 30, and where y increases by a constant amount (say, 1) for each successive point. Example points include: (3, 1), (14, 2), (7, 3), and (27, 4). Can an equation be created, with a computer program or by hand, that models such a set of points closely enough that we can recover the points from the equation?

    Of course we could use a line of best fit, but can we create an equation that models such a random set of points with precision, i.e. exactly enough that evaluating the equation returns the points above?

    While this may not be feasible, I'm trying to figure out if it's possible at all.

    I'm thinking that this is possible if just the right equation is created. It may be a long, drawn-out equation, but how do you think I could achieve this? A push in the right direction would be great.

    Thanks! :)
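What's being asked for here is essentially exact interpolation rather than regression: through n points with distinct x-values there is a unique polynomial of degree at most n − 1 that passes through every one of them. One standard construction is the Lagrange interpolating polynomial. A minimal Python sketch, using the sample points from the post (the function name is illustrative, not from the thread):

```python
def lagrange_interpolate(points, x):
    """Evaluate the unique degree-(n-1) polynomial through n points at x.

    points: list of (xi, yi) pairs; the xi values must be distinct.
    """
    total = 0.0
    for i, (xi, yi) in enumerate(points):
        # Basis term: equals yi at x = xi and 0 at every other xj.
        term = yi
        for j, (xj, _) in enumerate(points):
            if i != j:
                term *= (x - xj) / (xi - xj)
        total += term
    return total

pts = [(3, 1), (14, 2), (7, 3), (27, 4)]
# The polynomial reproduces each sample point exactly (up to float rounding).
for xv, yv in pts:
    assert abs(lagrange_interpolate(pts, xv) - yv) < 1e-9
```

The resulting cubic passes through all four points exactly, which is the "precision" asked about. The trade-off is that such a polynomial can oscillate wildly between and beyond the given points, so it recovers the data but extrapolates poorly.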
  3. May 4, 2006 #2