# Regression Analysis - Constructing a Model ((Need Help Please))

1. Feb 12, 2014

### Fjolvar

Hello,

I am trying to construct a general function/method based on two sets of minimum/maximum data point constraints, which can take on new values in different situations. The only known data for this general function are the starting point (y-axis intercept) and the x-range. The net change over the x-range must equal zero, so any amount increased or decreased must be compensated for within the x-range. Ideally, the function will vary as little as possible while still meeting the constraints.

I attached a figure of an example plot. The minimum constraint data points are marked as the 'necessary' function. The maximum constraint data points are marked as 'maximal.' The function marked as 'case 2' is an example of the general function that would satisfy the constraints.

I would appreciate any advice or suggestions on how to approach this problem as I've had very little success so far. Thank you in advance.

#### Attached Files:

• ###### Example Graph.jpg
2. Feb 13, 2014

### Stephen Tashi

In my opinion, a procedure for solving this general type of problem would boil down to having a computer program that minimizes a function of several variables subject to many constraints using a numerical method. There are many methods known for solving this type of problem, but they are not methods where you apply a relatively simple formula that produces the answer.

Let me try to interpret your description of the problem as mathematics.

We are given data for two functions $y_{lower}, y_{upper}$ that are defined on $N+1$ points $\{x_0, x_1, \dots, x_N\}$. We want to find a function $f(x)$ satisfying:

$N$ inequality constraints:

$y_{lower}(x_i) \lt f(x_i) \lt y_{upper}(x_i)$ for $i = 1, 2, \dots, N$

Two equality constraints:

$f(x_0) = f(x_N) = y_0$ for some number $y_0$

You want $f(x)$ to vary as little as possible. One way to express that mathematically is to say that we want to minimize the mean square distance between $f(x)$ and the horizontal line between $(x_0, y_0)$ and $(x_N, y_0)$.

So the "objective function" is $\frac{1}{N} \sum_{i=1}^N (f(x_i) - y_0)^2$.
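For concreteness, here is how that objective could be evaluated for a candidate function sampled on the grid. The numbers below are made up purely for illustration:

```python
import numpy as np

# Hypothetical sampled values f(x_1), ..., f(x_N) of a candidate function.
f_vals = np.array([2.0, 2.4, 2.1, 1.8, 2.0])
y0 = 2.0  # the common endpoint value

# Mean square deviation of f from the flat line y = y0.
objective = np.mean((f_vals - y0) ** 2)
```

A candidate that hugs the line $y = y_0$ more closely gives a smaller objective value, which is exactly the "vary as little as possible" criterion.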

In the mathematical sense, a "function" can be any rule at all, defined with symbols, words, or combinations thereof. Trying to solve the above problem for "the" best function is intractable because the class of all functions is far too vast to search.

To make the problem tractable, we can assume $f$ comes from a family of functions defined by a few parameters, such as the quadratic functions.

If we assume $f(x) = Ax^2 + Bx + C$ and also treat $y_0$ as a variable, the problem becomes:

Minimize $\frac{1}{N} \sum_{i=1}^N (f(x_i) - y_0)^2$
with respect to the variables $A, B, C, y_0$
subject to the constraints
$y_{lower}(x_i) \lt f(x_i) \lt y_{upper}(x_i)$ for $i = 1, 2, \dots, N$
$f(x_0) = f(x_N) = y_0$
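As a sketch of how this parametrized problem could be set up numerically, here is a version using SciPy's SLSQP solver. The grid, the bound curves, and the initial guess are all invented for illustration, and the solver treats the inequalities non-strictly ($\le$ rather than $\lt$):

```python
import numpy as np
from scipy.optimize import minimize

# Made-up example data: N+1 grid points and lower/upper bound curves.
x = np.linspace(0.0, 10.0, 11)           # x_0, ..., x_N
y_lower = 1.0 + 0.5 * np.sin(x)          # hypothetical 'necessary' curve
y_upper = 4.0 + 0.5 * np.sin(x)          # hypothetical 'maximal' curve

def f(params, x):
    """Quadratic trial function f(x) = A x^2 + B x + C."""
    A, B, C = params[0], params[1], params[2]
    return A * x**2 + B * x + C

def objective(params):
    """Mean square deviation of f from the flat line y = y0."""
    y0 = params[3]
    return np.mean((f(params, x) - y0) ** 2)

constraints = [
    # Inequalities: y_lower(x_i) <= f(x_i) <= y_upper(x_i) (fun >= 0).
    {"type": "ineq", "fun": lambda p: f(p, x) - y_lower},
    {"type": "ineq", "fun": lambda p: y_upper - f(p, x)},
    # Equalities: f(x_0) = f(x_N) = y0 (fun == 0).
    {"type": "eq", "fun": lambda p: f(p, x[0]) - p[3]},
    {"type": "eq", "fun": lambda p: f(p, x[-1]) - p[3]},
]

# Initial guess: a flat line through the middle of the band.
p0 = np.array([0.0, 0.0, 2.5, 2.5])
result = minimize(objective, p0, constraints=constraints, method="SLSQP")
A, B, C, y0 = result.x
```

With these particular bound curves a flat line already satisfies everything, so the solver finds an objective near zero; with tighter or crossing bounds, the quadratic would have to bend, and the optimizer trades curvature against the constraints.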

This format (Minimize ... with respect to ... subject to the constraints) is a standard way to state optimization problems. There are methods for solving such problems that resemble organized forms of trial and error, such as "simulated annealing" and "genetic programming". There are also algorithms that rely on calculus and "if...then..." decisions, such as "the conjugate gradient method" and versions of it that handle constraints.