Weird theorem on critical points for multivariable functions

Summary
The discussion centers on why the mixed second derivative term f_{xy}^2 is necessary when classifying critical points of multivariable functions. A proposed shortcut, that f_x= f_y= 0 together with f_{xx}f_{yy}> 0 already guarantees a local minimum or maximum, is refuted by the counterexample f(x,y)= x^2+ 100xy+ y^2, which has a saddle point at the origin. The correct criterion comes from the Hessian matrix: its eigenvalues determine the nature of the critical point, and the determinant f_{xx}f_{yy}- (f_{xy})^2 captures whether the eigenvalues share a sign (minimum or maximum) or differ in sign (saddle point). The thread also explains why no equally simple determinant test exists for functions of three or more variables, where the sign of the determinant does not determine the signs of the individual eigenvalues.
Nikitin
Why does the book write all that stuff about f_{xy}^2 (see the attachment)? Isn't it unnecessary?

I mean, isn't the following true?

If, at a point (a,b), f_x= f_y= 0 and f_{xx}f_{yy}> 0, then the second partial derivatives f_{xx} and f_{yy} must be either both negative or both positive, and thus (a,b) is a local minimum or maximum. And if f_{xx}f_{yy}< 0, then (a,b) is a saddle point.

Right? So why bring in the f_{xy}^2 term?
 

[Attachment: bilde(2).JPG]
f(x,y) = x^2 + 100xy + y^2

This has a saddle point at the origin. If you look at f(t,t) you get 102t^2, which is positive. If you look at f(t,-t) you get -98t^2, which is negative. So you have both positive and negative values near the origin, even though f_{xx}f_{yy}= 2\cdot 2= 4> 0. That is exactly why the product of f_{xx} and f_{yy} alone is not enough.
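
For anyone who wants to check this mechanically, here is a minimal sympy sketch (my own addition, not part of the original post) that computes the Hessian of this f, its determinant, and its eigenvalues:

import sympy as sp

x, y = sp.symbols('x y')
f = x**2 + 100*x*y + y**2

# Hessian: the matrix of second partial derivatives.
H = sp.hessian(f, (x, y))   # Matrix([[2, 100], [100, 2]])

# f_xx = f_yy = 2, so f_xx * f_yy = 4 > 0 ...
print(H.det())              # 2*2 - 100**2 = -9996 < 0
# ... yet the eigenvalues have mixed signs, so the origin is a saddle:
print(H.eigenvals())        # {102: 1, -98: 1}

Note that the eigenvalues 102 and -98 are exactly the coefficients that appeared along the lines y = x and y = -x above.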
 
Hmm, thanks. But is there an intuitive explanation for the theorem?
 
A more precise way of thinking about it is this: if f(x,y) is a function of two variables, then its "derivative" is NOT just the partial derivatives, \partial f/\partial x and \partial f/\partial y (it is possible for those partial derivatives to exist at a point while f itself is not even continuous, much less differentiable), but the gradient \nabla f= (\partial f/\partial x)\vec{i}+ (\partial f/\partial y)\vec{j}. (Even more precisely, the derivative is the linear transformation, at each point, given by the dot product \left((\partial f/\partial x)\vec{i}+ (\partial f/\partial y)\vec{j}\right)\cdot\left((x- x_0)\vec{i}+ (y- y_0)\vec{j}\right), but that is "represented" by the gradient vector.)

In that sense the second derivative is given by the linear transformation "represented", at each point, by the matrix of second partial derivatives (the Hessian matrix)
\begin{bmatrix}\frac{\partial^2 f}{\partial x^2} & \frac{\partial^2 f}{\partial x\partial y} \\ \frac{\partial^2 f}{\partial y\partial x} & \frac{\partial^2 f}{\partial y^2}\end{bmatrix}

Because, as long as f has continuous second derivatives, the "mixed" second derivatives are equal, so that is a symmetric matrix and therefore has real eigenvalues and two independent eigenvectors. If we were to use the directions of those eigenvectors as coordinate lines, x' and y', the matrix representing the second derivative would be "diagonal":
\begin{bmatrix}\frac{\partial^2 f}{\partial x'^2} & 0 \\ 0 & \frac{\partial^2 f}{\partial y'^2}\end{bmatrix}
where those two derivatives (evaluated at the given point) are the "eigenvalues" of the original second derivative matrix.
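
To see those eigen-directions concretely for the counterexample f = x^2+ 100xy+ y^2 above (again a sympy sketch of my own, not from the thread):

import sympy as sp

x, y = sp.symbols('x y')
H = sp.hessian(x**2 + 100*x*y + y**2, (x, y))

# A symmetric matrix has real eigenvalues and independent eigenvectors.
for val, mult, vecs in H.eigenvects():
    print(val, list(vecs[0]))
# eigenvalue 102 has eigenvector along (1, 1): f curves upward there;
# eigenvalue -98 has eigenvector along (1, -1): f curves downward there.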

Now, at a point where the first derivatives are 0 (a critical point) and the "mixed" second derivatives are 0, as in the x', y' coordinate system, we can write, to second degree, f(x, y)\approx f(x_0, y_0)+ \frac{1}{2}f_{xx}(x_0,y_0)(x- x_0)^2+ \frac{1}{2}f_{yy}(x_0, y_0)(y- y_0)^2. And it is easy to see from this that:
1) if f_{xx}(x_0, y_0)= a and f_{yy}(x_0, y_0)= b are both positive, we have f(x, y)\approx f(x_0, y_0)+ \frac{a}{2}(x- x_0)^2+ \frac{b}{2}(y- y_0)^2, so that (x_0, y_0) is a local "minimum".
2) if f_{xx}(x_0, y_0)= -a and f_{yy}(x_0, y_0)= -b are both negative, we have f(x, y)\approx f(x_0, y_0)- \frac{a}{2}(x- x_0)^2- \frac{b}{2}(y- y_0)^2, so that (x_0, y_0) is a local "maximum".
3) if f_{xx}(x_0, y_0)= a and f_{yy}(x_0, y_0)= -b are one positive and the other negative, we have f(x, y)\approx f(x_0, y_0)+ \frac{a}{2}(x- x_0)^2- \frac{b}{2}(y- y_0)^2, so that (x_0, y_0) is a saddle point.
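
To make cases 1) - 3) concrete, carry this out for the counterexample above (my own worked computation). The eigenvector directions of its Hessian are y= x and y= -x, so set x= (u+ v)/\sqrt{2}, y= (u- v)/\sqrt{2}. Then
f= x^2+ 100xy+ y^2= 51u^2- 49v^2
so f_{uu}= 102> 0 and f_{vv}= -98< 0: one positive and one negative, which is case 3), a saddle point, even though f_{xx}f_{yy}= 4> 0 in the original coordinates.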

So the question is about the eigenvalues of that two by two matrix. If both are positive, the point is a local minimum; if both are negative, a local maximum; and if they have different signs, a saddle point (of course, just as in the one variable situation, if either is 0 the test tells us nothing). Further, the determinant is independent of the coordinate system: the two determinants
\left|\begin{array}{cc}\frac{\partial^2 f}{\partial x'^2} & 0 \\ 0 & \frac{\partial^2 f}{\partial y'^2}\end{array}\right|= f_{x'x'}f_{y'y'}
\left|\begin{array}{cc}\frac{\partial^2 f}{\partial x^2} & \frac{\partial^2 f}{\partial x\partial y} \\ \frac{\partial^2 f}{\partial y\partial x} & \frac{\partial^2 f}{\partial y^2}\end{array}\right|= f_{xx}f_{yy}- (f_{xy})^2
are the same. So both eigenvalues, and hence both second derivatives in the rotated coordinates, have the same sign (a minimum or a maximum) if and only if f_{xx}f_{yy}- (f_{xy})^2> 0, and they have different signs (a saddle point) if and only if f_{xx}f_{yy}- (f_{xy})^2< 0.
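
Applied to the counterexample above: f_{xx}= f_{yy}= 2 and f_{xy}= 100, so
f_{xx}f_{yy}- (f_{xy})^2= 4- 10000= -9996< 0
and the determinant test correctly reports the saddle point that the bare product f_{xx}f_{yy}= 4> 0 misses.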

This also shows why we do not have a similar formula for three or more variables: all the analysis down to the diagonal matrix goes through, but the determinant is now a product of three or more numbers, and its sign does not determine the signs of the individual eigenvalues. If, in the three variable case, the product is positive, it might be that all three eigenvalues are positive or that one is positive and the other two are negative.
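
A quick sympy illustration of that ambiguity (again my own sketch, not from the thread):

import sympy as sp

x, y, z = sp.symbols('x y z')

# Both Hessian determinants are positive at the origin,
# yet only the first function has a minimum there.
for f in (x**2 + y**2 + z**2,   # eigenvalues 2, 2, 2    -> local minimum
          x**2 - y**2 - z**2):  # eigenvalues 2, -2, -2  -> saddle point
    H = sp.hessian(f, (x, y, z))
    print(H.det(), H.eigenvals())
# prints determinant 8 both times, with eigenvalues {2: 3} vs {2: 1, -2: 2}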
 
thanks!
 
