Gradient and parallel points of a function

In summary, for f(x,y) = x^2 + y^3 + 2xy the gradient is grad(f) = (2x + 2y, 3y^2 + 2x), and it is parallel to u = (1,1) wherever grad(f) = c*u for some scalar c, i.e. wherever the two partial derivatives are equal: 2x + 2y = 3y^2 + 2x. This reduces to 2y = 3y^2, so the set of points consists of the horizontal lines y = 0 and y = 2/3. Note that grad(f) and u do not need to be equal, only parallel; setting the partials equal to the components of u, as in the attempt below, picks out only a single point.
  • #1
wolfgangjt

Homework Statement


Find the set of points where the gradient of f is parallel to u = (1,1) for
f(x,y) = x^2 + y^3 + 2xy

Homework Equations


grad f(x,y) = (∂f/∂x, ∂f/∂y)
u=grad f

The Attempt at a Solution


fx = 2x+2y
fy = 3y^2 + 2x

1=2x+2y
1 = 3y^2 + 2x

x=1/2
y=0

I set the partials equal to the respective components of u and solved, and got just one point: x = 1/2, y = 0. I have a feeling this is wrong.

I looked through my notes and couldn't find anything to help with this, so I came here in the hope of getting pointed in the right direction.
 
  • #2
You don't need grad(f) = u; they just need to be parallel, so grad(f) = c*u for some scalar c.
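To spell that hint out (my own worked sketch, not part of the original thread): requiring grad(f) = c*u with u = (1,1) means both components of the gradient equal the same scalar c, so

$$2x + 2y = 3y^2 + 2x \;\Longleftrightarrow\; 2y = 3y^2 \;\Longleftrightarrow\; y = 0 \ \text{or} \ y = \tfrac{2}{3},$$

i.e. the set of points is the pair of horizontal lines y = 0 and y = 2/3 (with x arbitrary), possibly excluding the isolated points where grad(f) = 0 if a nonzero multiple of u is required.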
 

Related to Gradient and parallel points of a function

1. What is a gradient of a function?

The gradient of a function is a vector that points in the direction of the greatest rate of increase of the function at a specific point; its magnitude is that rate of change. It is calculated by taking the partial derivatives of the function with respect to each variable and collecting them into a vector.
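As a concrete example, for the function in this thread (using the standard component formula just described):

$$\nabla f(x,y) = \left(\frac{\partial f}{\partial x},\, \frac{\partial f}{\partial y}\right) = (2x + 2y,\; 3y^2 + 2x).$$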

2. How is the gradient related to parallel points of a function?

The gradient of a function at a point is perpendicular to the level curve of the function passing through that point. So the points where the gradient is parallel to a fixed vector u are exactly the points where the level curve has the same tangent direction, namely the direction perpendicular to u.
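A one-line justification of the perpendicularity claim (the standard chain-rule argument, added here for completeness): if (x(t), y(t)) parametrizes a level curve, then f(x(t), y(t)) is constant, so

$$0 = \frac{d}{dt} f(x(t), y(t)) = \nabla f \cdot (x'(t), y'(t)),$$

which says the gradient is orthogonal to the tangent vector of the level curve.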

3. Can a function have multiple gradient vectors?

Yes. The gradient of a function generally changes from point to point, so there is a (possibly different) gradient vector at each point where the function is differentiable.

4. What is the significance of the gradient in optimization problems?

The gradient is crucial in optimization problems because it points in the direction of steepest ascent (and its negative in the direction of steepest descent). This is what makes gradient-based methods an efficient way to search for maximum or minimum values of a function.
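As an illustration of the steepest-descent idea, here is a minimal gradient-descent sketch in Python (my own example, reusing the gradient of the f from this thread; the starting point, step size, and iteration count are arbitrary choices, and this f is unbounded below, so the sketch only demonstrates the update rule rather than finding a true minimum):

Code:
# Gradient of f(x, y) = x^2 + y^3 + 2xy
def grad_f(x, y):
    return 2*x + 2*y, 3*y**2 + 2*x

x, y = 1.0, 1.0   # arbitrary starting point
step = 0.05       # arbitrary step size

for i in range(10):
    gx, gy = grad_f(x, y)
    x -= step * gx    # move against the gradient: steepest descent
    y -= step * gy
    print(f"step {i}: x = {x:.4f}, y = {y:.4f}, f = {x**2 + y**3 + 2*x*y:.4f}")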

5. How is the gradient used in machine learning algorithms?

In machine learning, the gradient is used in optimization algorithms such as gradient descent to find the optimal parameters for a model. It is also used in backpropagation for updating the weights of a neural network during training.
