In the strictest sense, the "derivative" of a function of three variables, f(x,y,z), at (x_0, y_0, z_0), is the linear function from R^3 to R, U(x,y,z)= f_{x}(x_0,y_0,z_0)(x- x_0)+ f_y(x_0, y_0, z_0)(y- y_0)+ f_z(x_0,y_0,z_0)(z- z_0), which we can think of as the dot product <f_x(x_0, y_0,z_0), f_y(x_0,y_0,z_0), f_z(x_0, y_0, z_0)>\cdot<x- x_0, y-y_0, z- z_0>, and so can be "represented" by the gradient vector <f_x(x_0, y_0,z_0), f_y(x_0, y_0, z_0), f_z(x_0, y_0, z_0)>.
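As a quick numerical sanity check (my own sketch, not part of the original argument, using a sample function f(x,y,z) = xy + z^2), the linear function U built from the gradient really does approximate f near the point, with error shrinking like the square of the step size:

```python
import numpy as np

# Sample function chosen for illustration: f(x,y,z) = x*y + z^2
def f(x, y, z):
    return x * y + z ** 2

# Its gradient, computed by hand: (y, x, 2z)
def grad_f(x, y, z):
    return np.array([y, x, 2 * z])

p0 = np.array([1.0, 2.0, 0.5])

def U(p):
    """Linear approximation of f at p0: f(p0) + gradient . (p - p0)."""
    return f(*p0) + grad_f(*p0) @ (p - p0)

# Step a small distance h away from p0: the approximation error is O(h^2)
h = 1e-3
p = p0 + h * np.array([1.0, 1.0, 1.0])
err = abs(f(*p) - U(p))
print(err)  # on the order of h^2, i.e. about 1e-6
```

Halving h should roughly quarter the error, which is exactly what "best linear approximation" means here.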
In that same sense, the second derivative of a function of three variables is the linear function from R^3 to R^3 (the space of such gradient vectors), which can be represented by the three by three matrix:
\begin{bmatrix}f_{xx} & f_{xy} & f_{xz} \\ f_{yx} & f_{yy} & f_{yz} \\ f_{zx} & f_{zy} & f_{zz}\end{bmatrix}
Now, because the mixed derivatives are equal (assuming the second partials are continuous): f_{xy}= f_{yx}, f_{yz}= f_{zy}, and f_{xz}= f_{zx}, that is a symmetric matrix, which means it is diagonalizable. That is, there exists some coordinate system, x', y', z', in which all mixed derivatives are 0 and the matrix is
\begin{bmatrix}f_{x'x'} & 0 & 0 \\ 0 & f_{y'y'} & 0 \\ 0 & 0 & f_{z'z'}\end{bmatrix}
where those second derivatives, f_{x'x'}, f_{y'y'}, f_{z'z'}, evaluated at (x_0, y_0, z_0) are the eigenvalues of the matrix.
And that, in turn, means that, at a critical point (where all the first derivatives are 0), we can write in that coordinate system, to second order,
f(x',y',z')\approx f(x_0,y_0, z_0)+ \frac{1}{2}f_{x'x'}(x'- x_0)^2+ \frac{1}{2}f_{y'y'}(y'- y_0)^2+ \frac{1}{2}f_{z'z'}(z'- z_0)^2.
Now we can see: if all of those eigenvalues are positive, (x_0, y_0, z_0) is a minimum; if all negative, a maximum; if some positive and some negative, a saddle point. Now, if this were two variables, x and y, say, our matrix would be 2 by 2:
\begin{bmatrix}f_{xx} & f_{xy} \\ f_{yx} & f_{yy}\end{bmatrix}
or, in the x', y' coordinate system in which it is diagonal,
\begin{bmatrix}f_{x'x'} & 0 \\ 0 & f_{y'y'}\end{bmatrix}
Notice that this last matrix has determinant f_{x'x'}f_{y'y'}, which is positive if and only if f_{x'x'} and f_{y'y'} have the same sign, and negative if and only if they have different signs. But the determinant is independent of the coordinate system, so we can say that f has a saddle point if and only if f_{xx}f_{yy}- f_{xy}^2< 0, and a max or min if it is positive (in that case f_{xx} and f_{yy} must have the same sign, so you can check either one to see whether it is a max or min).
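The two-variable test can be sketched in a few lines of code (my own example functions, not from the post); the discriminant D = f_{xx}f_{yy} - f_{xy}^2 does all the work:

```python
# Classify a critical point of a two-variable function from the
# second derivatives evaluated there.
def classify_2d(fxx, fyy, fxy):
    """Second-derivative test: D = fxx*fyy - fxy^2 at a critical point."""
    D = fxx * fyy - fxy ** 2
    if D < 0:
        return "saddle point"
    if D > 0:
        return "minimum" if fxx > 0 else "maximum"
    return "inconclusive"  # D = 0: the test gives no answer

# f(x,y) = x^2 - y^2 at (0,0): f_xx = 2, f_yy = -2, f_xy = 0
print(classify_2d(2, -2, 0))   # saddle point
# f(x,y) = x^2 + y^2 at (0,0): f_xx = 2, f_yy = 2, f_xy = 0
print(classify_2d(2, 2, 0))    # minimum
```

Note the D = 0 branch: when the determinant vanishes, the quadratic terms alone cannot decide, and higher-order terms must be examined.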
Unfortunately, it isn't that easy with three variables. If the determinant is positive, it might be that all three eigenvalues are positive (so a minimum) or that one is positive and the other two negative (a saddle point); if the determinant is negative, it might be that all three eigenvalues are negative (so a maximum) or that one is negative and the other two positive (a saddle point). You really need to identify all three eigenvalues of the "second derivative matrix" in order to know what you have.
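Numerically, this amounts to computing the eigenvalues of the Hessian. A sketch (the example f(x,y,z) = x^2 + y^2 - z^2 is mine, chosen to show the ambiguity just described):

```python
import numpy as np

def classify_hessian(H, tol=1e-12):
    """Classify a critical point from the symmetric second-derivative
    matrix H evaluated there, using the signs of its eigenvalues."""
    eig = np.linalg.eigvalsh(H)  # eigenvalues of a symmetric matrix
    if np.any(np.abs(eig) <= tol):
        return "inconclusive"     # a zero eigenvalue: the test fails
    if np.all(eig > 0):
        return "minimum"
    if np.all(eig < 0):
        return "maximum"
    return "saddle point"

# f(x,y,z) = x^2 + y^2 - z^2 at the origin has Hessian diag(2, 2, -2).
# Its determinant is negative, yet the point is a saddle, not a maximum --
# exactly why the determinant alone cannot decide in three variables.
H = np.diag([2.0, 2.0, -2.0])
print(np.linalg.det(H))      # negative determinant
print(classify_hessian(H))   # saddle point
```

Replacing the last entry with +2 flips the verdict to "minimum" even though the determinant is now positive in both the all-positive and one-positive cases, so the full eigenvalue list is what matters.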