# Maxima-Minima for a 6 independent variable problem

1. Jul 29, 2009

### e2m2a

I was wondering how complex it would be to determine the maxima/minima extrema of a function of 6 independent variables? I am assuming it might be enormous, if not untenable, since it is all done in "hyper-space" and neither the dependent variable nor all 6 of the independent variables could be graphed simultaneously.

2. Jul 29, 2009

### hotvette

I believe the difficulty depends entirely on the nature of the function and how close the initial starting point is to the sought-after min/max point. There are a variety of techniques for finding maxima/minima of functions of several variables.

One option is to set the partial derivatives of the function equal to zero, which in your case would give a system of 6 simultaneous equations. If those equations are nonlinear, you can try Newton-Raphson to solve them iteratively.
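As a concrete illustration of that suggestion, here is a minimal sketch of Newton-Raphson applied to the gradient of a hypothetical 6-variable function (the function, its gradient, and its Hessian below are made-up examples, not anything from the thread):

```python
import numpy as np

# Hypothetical example: f(x) = sum_i (x_i - i)^2 + 0.1 * x_0 * x_1.
# Setting grad f = 0 gives 6 simultaneous equations; Newton-Raphson
# solves them iteratively via x <- x - H(x)^{-1} grad(x).

def grad(x):
    g = 2.0 * (x - np.arange(6))   # gradient of the quadratic terms
    g[0] += 0.1 * x[1]             # coupling term contributions
    g[1] += 0.1 * x[0]
    return g

def hessian(x):
    H = 2.0 * np.eye(6)            # constant Hessian for this example
    H[0, 1] += 0.1
    H[1, 0] += 0.1
    return H

x = np.zeros(6)                    # initial starting point
for _ in range(50):
    step = np.linalg.solve(hessian(x), grad(x))
    x = x - step
    if np.linalg.norm(step) < 1e-12:
        break                      # gradient is (numerically) zero

print("critical point:", x)
```

Since this example function is quadratic, Newton-Raphson converges in one step; for a genuinely nonlinear function the quality of the starting point matters, as hotvette notes above.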

3. Jul 30, 2009

### e2m2a

OK, thanks for the response. However, with 6 independent variables, if one took all the possible combinations of incremental values for each variable, the total number of combinations would grow without bound, so it seems almost impossible to search exhaustively. I guess what I'm wondering is how we could "prove" that we have found the complete, finite set of extrema. Is there a theorem stating that a given multivariable function can have only so many maxima/minima on a given domain? That is, depending on the function and the relation between the variables, could we prove that there are exactly x extrema and no others for a given 6-variable function? But again, would this theorem be constrained by the domain? What if the domain is the set of all real numbers, an infinite domain? Wouldn't that imply there could be an infinite set of maxima/minima?

4. Jul 30, 2009

### HallsofIvy

No, there is no theorem that says that, because it is not true. Some functions have an infinite number of points of max and min within a finite region.

5. Jul 30, 2009

### HallsofIvy

For example, in just one variable, f(x) = sin(1/x) has an infinite number of local maxima and minima on (0, 1).
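To make that example concrete: f'(x) = -cos(1/x)/x², which vanishes exactly where cos(1/x) = 0, i.e. at x_k = 1/((k + 1/2)π) for k = 0, 1, 2, ... — infinitely many points, all inside (0, 1), accumulating at 0. A short check:

```python
import math

# Critical points of f(x) = sin(1/x) on (0, 1):
# f'(x) = -cos(1/x)/x^2 = 0  <=>  1/x = (k + 1/2)*pi
# so x_k = 1/((k + 1/2)*pi), k = 0, 1, 2, ...
crit = [1.0 / ((k + 0.5) * math.pi) for k in range(10)]

for k, x in enumerate(crit):
    fx = math.sin(1.0 / x)   # alternates +1 (local max) / -1 (local min)
    print(f"k={k}: x={x:.6f}, f(x)={fx:+.1f}")
```

Each x_k gives f(x_k) = sin((k + 1/2)π) = ±1, alternating between local maxima and minima, and k can be taken as large as you like.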

6. Jul 30, 2009

### slider142

If your function is twice continuously differentiable, then (on an open set) you can count on a standard vector calculus theorem. Suppose a is a critical point of f (the derivative is 0 there). If the quadratic form corresponding to the quadratic terms of the Taylor polynomial at a is positive definite (the form sends every nonzero x to a positive number), then f(a) is a strict local minimum of f; if the form is negative definite, f(a) is a strict local maximum.

If the set you are analyzing is not open, you will have to carry out the analysis over the boundary points as well. Points where f is not twice continuously differentiable (i.e., cusps) have to be analyzed separately, as do points where the quadratic form is degenerate (vanishes). For anything more than simple functions, you will probably need Newton-Raphson anyway.
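The quadratic form slider142 describes is the Hessian matrix of second partial derivatives, and its definiteness can be checked numerically via its eigenvalues. A hedged sketch (the helper name `classify` and the tolerance are my own choices, not anything from the thread):

```python
import numpy as np

# Second-derivative test at a critical point a: classify the quadratic
# form (the Hessian of f at a) by the signs of its eigenvalues.
def classify(hessian, tol=1e-10):
    eig = np.linalg.eigvalsh(hessian)      # Hessian is symmetric
    if np.all(eig > tol):
        return "strict local minimum"      # positive definite
    if np.all(eig < -tol):
        return "strict local maximum"      # negative definite
    if np.any(np.abs(eig) <= tol):
        return "degenerate (test inconclusive)"
    return "saddle point"                  # mixed signs, all nonzero

# Example: f(x) = x_1^2 + ... + x_6^2 has Hessian 2I at its critical
# point 0, which is positive definite, hence a strict local minimum.
print(classify(2.0 * np.eye(6)))
```

The degenerate case is exactly the one slider142 says must be analyzed separately: when some eigenvalue is zero, the quadratic terms alone do not decide the behavior.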

Last edited: Jul 30, 2009
7. Jul 31, 2009

### e2m2a

Thanks everyone for the feedback. I find this an interesting problem because one has to go blind into the night of hyper-space, raging against the dying light to find one's way around in what could be an infinite collection of mountains and valleys. (Apologies to Dylan Thomas). One could proceed in one direction hopefully to find mountains and valleys, but does direction have any meaning in a 6-space?