Multivariable Calculus Gradient-based question

In summary, the student wants the path of steepest ascent up a hill whose height is given by f(x,y) = x^2y - 2xy + 5, expressed as a relation in x and y and starting from the point (2,1). The student has the partial derivatives, ∂z/∂x = 2xy - 2y and ∂z/∂y = x^2 - 2x, but is unsure how to turn them into the curve traced out by continually moving along the gradient. The discussion leads to parametrizing the path by t, writing the gradient components as dx/dt and dy/dt, and solving (or eliminating t from) the resulting system of differential equations.
  • #1
nivekious

Homework Statement


Find, as an expression in terms of x and y, the steepest path up a hill whose height is given by f(x,y) = x^2y - 2xy + 5, starting at (2,1).


The attempt at a solution

I know that I need to get a set of parametric equations for x and y, because I have done problems sort of like this before, but those always had only one variable in the partials. I'm unsure of how to get these equations when ∂z/∂x = 2xy - 2y and ∂z/∂y = x^2 - 2x.
 
  • #2
You know that the gradient will give you the direction of steepest increase/descent, correct? That is half of the battle already. Now just take the gradient of the function and evaluate it at that point, which will give you your direction. If you know how to parametrize the path, i.e. work with f(x + tv), then you should be good to go. If you don't know how to parametrize the path, let us know, and we'll walk you through it.
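For the hill in the problem statement (just a quick check of what that gradient looks like, using the partials from post #1):
[tex]\nabla f = (2xy - 2y)\,\vec{i} + (x^2 - 2x)\,\vec{j}[/tex]
so at the starting point (2,1) it evaluates to [itex](2, 0)[/itex], i.e. the initial direction of steepest ascent points straight along the x-axis.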
 
  • #3
Well, I know that the gradient will give me the direction of steepest ascent at that point, but I need to find the curve that is followed by continually moving along the gradient at every point. I'm not sure what you mean by f(x + tv) (it may just be that this is something we haven't covered yet in class; it's an extra credit problem and the professor likes to do that sometimes).

So, if you could walk me through that that would be great.
 
  • #4
By "path" do you mean another real-valued function like f(x,y) in this example, something like a 3-D analogue of a derivative curve in 2-D? I'm just trying to figure out exactly what your professor wants you to do.
 
  • #5
I think that's pretty much what he wants. I asked him in class the other day, and I think he said it could be done by solving dy/dx = (dy/dt)/(dx/dt), but I'm not sure how to get to the parametric equations with more than one variable in the partials.
 
  • #6
I believe that you would need to parametrize the curve, then compose it with the gradient vector field.
 
  • #7
If [itex]\nabla f= P(x,y)\vec{i}+ Q(x,y)\vec{j}[/itex], then the path having that vector as its tangent vector, with parameter t, must satisfy
[tex]\frac{dx}{dt}= P(x,y)[/tex]
[tex]\frac{dy}{dt}= Q(x,y)[/tex]
Solve that system of differential equations.
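For the f(x,y) in this thread that system reads (a sketch of one way to carry it out, not necessarily the method the professor has in mind):
[tex]\frac{dx}{dt} = 2xy - 2y, \qquad \frac{dy}{dt} = x^2 - 2x[/tex]
Dividing the second equation by the first, as suggested in post #5, eliminates t:
[tex]\frac{dy}{dx} = \frac{x^2 - 2x}{2y(x - 1)}[/tex]
which is separable: [itex]2y\,dy = \left[(x-1) - \frac{1}{x-1}\right]dx[/itex], so [itex]y^2 = \tfrac{1}{2}(x-1)^2 - \ln|x-1| + C[/itex], with C fixed by requiring the curve to pass through the starting point (2,1).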
 

1. What is the gradient in multivariable calculus?

The gradient in multivariable calculus is a vector that points in the direction of the steepest increase of a function at a given point. It is calculated by taking the partial derivatives of the function with respect to each variable.
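In symbols, using the same two-variable setup as the thread above:
[tex]\nabla f(x,y) = \frac{\partial f}{\partial x}\,\vec{i} + \frac{\partial f}{\partial y}\,\vec{j}[/tex]
For the hill [itex]f(x,y) = x^2y - 2xy + 5[/itex] from post #1 this gives [itex]\nabla f = (2xy - 2y,\; x^2 - 2x)[/itex].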

2. How is the gradient used in multivariable calculus?

The gradient is used to find the direction and magnitude of the maximum rate of change of a function. It is also used to find the direction of steepest descent, which is important in optimization problems.
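Concretely, the rate of change of f in a unit direction [itex]\vec{u}[/itex] is the directional derivative
[tex]D_{\vec{u}}f = \nabla f \cdot \vec{u}[/tex]
which is largest when [itex]\vec{u}[/itex] points along [itex]\nabla f[/itex] (with value [itex]|\nabla f|[/itex]) and most negative when it points along [itex]-\nabla f[/itex], the direction of steepest descent.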

3. What is the relationship between the gradient and level curves?

The gradient is always perpendicular to the level curves of a function. The level curves connect points with the same function value, so the function does not change as you move along them; the gradient, by contrast, points in the direction of steepest increase at a given point, and that direction is orthogonal to the level curve through the point.
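A standard example: for [itex]g(x,y) = x^2 + y^2[/itex] the level curves are circles centred at the origin, and [itex]\nabla g = (2x, 2y)[/itex] points radially outward, perpendicular to each circle.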

4. Can the gradient be negative?

The gradient is a vector, so it is not positive or negative as a whole, but its individual components can certainly be negative. What can be negative is the rate of change of the function in a particular direction (the directional derivative): the gradient always points in the direction of steepest increase, and if the function is decreasing along some other direction, the directional derivative in that direction is negative.

5. How is the gradient used in machine learning?

In machine learning, the gradient is used in gradient descent algorithms to find the minimum of a cost function. By taking small steps in the direction of the negative gradient, the algorithm can iteratively approach the minimum of the function and find the optimal values for the parameters of a model.
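As a minimal illustration (a toy sketch, not taken from the thread), gradient descent in Python on the one-parameter cost [itex]f(w) = (w - 3)^2[/itex], whose gradient is [itex]2(w - 3)[/itex]:
[code]
# Toy gradient descent: minimize f(w) = (w - 3)^2.
# The gradient is f'(w) = 2*(w - 3); the minimum is at w = 3.
def grad(w):
    return 2.0 * (w - 3.0)

w = 0.0      # initial guess for the parameter
lr = 0.1     # learning rate (step size)
for _ in range(100):
    w -= lr * grad(w)   # step against the gradient

print(w)     # converges toward 3.0
[/code]
Each update moves w a small step in the direction of the negative gradient, which is exactly the "steepest descent" idea discussed above.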
