1. Feb 2, 2005

How is it different (or how is it an upgrade) from normal differentiation?

2. Feb 2, 2005

### matt grime

Is normal differentiation defined for functions of more than one variable?

3. Feb 2, 2005

### dextercioby

That's an interesting choice of words..."Upgrade"... :tongue2:

Okay. Think of a function of 2 INDEPENDENT variables, $z=z(x,y)$, which means that the values of "x" are totally independent of the values of "y". You want to calculate the RATE OF CHANGE OF "z" WRT "x". How do you do that? Simple: you fix "y" (which means that in the function "z" the variable "y" becomes a constant) and then compute the ordinary derivative of "z" wrt "x", just like in the case of one-variable functions.
Mathematically

$$\frac{\partial z(x,y)}{\partial x} := \left.\frac{dz(x,y)}{dx}\right|_{y=\text{const.}}$$

Daniel.

P.S.Mathematicians could give a geometric interpretation as well...
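Daniel's recipe (freeze "y", then differentiate in "x" as usual) can be sketched numerically. This is an illustration of mine, not from the thread; the example function `z` and the step size `h` are arbitrary choices:

```python
# Sketch (not from the thread): the partial derivative dz/dx is just
# the ordinary derivative with y held constant, approximated here by
# a symmetric difference quotient.

def z(x, y):
    return x**2 * y          # example function z(x, y) = x^2 * y

def partial_x(f, x, y, h=1e-6):
    """Rate of change of f in x, holding y fixed."""
    return (f(x + h, y) - f(x - h, y)) / (2 * h)

# Analytically dz/dx = 2*x*y, so at (3, 2) the answer should be 12.
print(partial_x(z, 3.0, 2.0))    # close to 12.0
```

The same function with the roles swapped (`f(x, y + h)` etc.) would give the partial with respect to "y".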

4. Feb 2, 2005

5. Feb 2, 2005

### DoubleMike

That gives you a function which describes the rate of change of z only with respect to x, correct?

I've encountered in the past some real-world problems where I needed to optimize a function which took multiple variables.

While at any given point I could optimize the function for a given variable, the "global optimum" proved elusive.

I ran across one method to optimize it, a numerical and graphical approach called Monte Carlo simulation, or some such.
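Assuming DoubleMike means random sampling of the domain (the thread never spells the method out), the idea can be sketched like this; the bumpy test function and the sample count are my own arbitrary choices:

```python
# Sketch of a random-search "Monte Carlo" global optimization:
# sample the domain at random and keep the best point seen.
import math
import random

def f(x, y):
    """Bumpy example function; its global minimum is 0 at (0, 0)."""
    return x**2 + y**2 + 2 - 2 * math.cos(3 * x)

random.seed(0)                      # reproducible run
best_point, best_val = None, float('inf')
for _ in range(20000):
    x = random.uniform(-5, 5)
    y = random.uniform(-5, 5)
    v = f(x, y)
    if v < best_val:
        best_point, best_val = (x, y), v

print(best_point, best_val)         # best_val should be close to 0
```

Unlike gradient-based local optimization, this never gets stuck in a local minimum, at the price of needing many samples and giving only a probabilistic answer.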

6. Feb 2, 2005

### mathwonk

given a function f : R^n --> R and a point a in R^n, just compose with a linear map R --> R^n taking 0 to a. Then the derivative of the composition R --> R^n --> R is the directional derivative of f in the direction of the velocity vector of the curve R --> R^n.

If the map R --> R^n happens to be the inclusion of one of the standard axes of R^n, we call the derivative of the composite the partial derivative wrt the given axis variable.

i.e. partial derivatives are directional derivatives in the standard axis directions.
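mathwonk's construction translates directly into a numeric sketch: compose f with the line t -> a + t*v and take an ordinary one-variable derivative at t = 0. The example f, point a, and direction v below are mine, for illustration only:

```python
# Directional derivative as the derivative of a composition
# R --> R^n --> R, evaluated at t = 0.

def f(p):
    x, y = p
    return x**2 * y + y**3          # example f : R^2 -> R

def directional_derivative(f, a, v, h=1e-6):
    """Derivative at t = 0 of the composite g(t) = f(a + t*v)."""
    def g(t):
        return f([ai + t * vi for ai, vi in zip(a, v)])
    return (g(h) - g(-h)) / (2 * h)

a = [1.0, 2.0]
# v along the first standard axis gives the partial wrt x:
# here df/dx = 2*x*y = 4 at (1, 2).
print(directional_derivative(f, a, [1.0, 0.0]))   # close to 4.0
```

Choosing v to be a standard basis vector recovers the partial derivatives, exactly as the post says.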

7. Feb 2, 2005

### cepheid

Staff Emeritus
Umm, hmm. If I remember my multivariable calc right, this optimization involves setting the gradient of the function equal to zero and then testing a whole bunch of points and doing all sorts of checking to see whether you have a max, min, or saddle point. Wow...It's been a while. Need to review...

8. Feb 2, 2005

### mathwonk

a function is approximated most simply by its tangent line, the linear term of a taylor series. if that term is zero then the next piece of information is the second order term of the taylor series, i.e. the parabola defined by the second derivative.

if that parabola is right side up, there is a min, if upside down, there is a max,...

similarly for a function of several variables, the first approximation is by the linear terms of the taylor series, i.e. the tangent plane to the graph, i.e. the plane orthogonal to the gradient vector. if that vector is zero then the next approximation is given by the quadratic terms of the taylor series, i.e. by the approximating quadric surface defined by the second derivatives.

if that quadric surface is a right-side-up paraboloid, it is a min; if an upside-down paraboloid, it is a max; if a saddle surface, it is neither.

one needs then to know how to recognize these quadric surfaces from their equations

ax^2 + bxy + cy^2, and know what relations between a, b, c tell you the shape. of course basically the three cases are x^2 + y^2, -x^2 - y^2, and x^2 - y^2.
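The relation between a, b, c and the shape can be made concrete. This is a sketch of mine following mathwonk's recipe: for the quadratic part ax^2 + bxy + cy^2 the Hessian entries are f_xx = 2a, f_xy = b, f_yy = 2c, so its determinant is 4ac - b^2, and the sign of that determinant (plus the sign of a) picks out the three model surfaces:

```python
# Classify the quadric ax^2 + bxy + cy^2 at a critical point
# via the Hessian determinant 4ac - b^2.

def classify(a, b, c):
    det = 4 * a * c - b * b
    if det > 0:
        return "min" if a > 0 else "max"   # right-side-up vs upside-down paraboloid
    if det < 0:
        return "saddle"                    # saddle surface
    return "degenerate"                    # second derivative test is inconclusive

# mathwonk's three model cases:
print(classify(1, 0, 1))    # x^2 + y^2   -> "min"
print(classify(-1, 0, -1))  # -x^2 - y^2  -> "max"
print(classify(1, 0, -1))   # x^2 - y^2   -> "saddle"
```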