Can you find the gradient of a vector?

Summary
The discussion centers on the concept of finding the gradient of a vector function, which is generally not conventional as gradients are typically defined for scalar fields. Participants explore whether the gradient operation can be applied to a vector, suggesting that it may lead to a matrix if applied to each component. They clarify that while the gradient is used for scalars, operations like divergence and curl serve as vector counterparts. The conversation emphasizes that the question may be intended to provoke critical thinking rather than to imply a straightforward mathematical operation. Understanding tensors may be necessary for a deeper interpretation of gradients applied to vectors.
anban

Homework Statement



I know you can find the gradient of a scalar using partial derivatives. Does it make sense to find the gradient of a vector, however?

A homework problem of mine asks to find the gradient of a vector. I'm starting to think it's a trick question...

Homework Equations



∇ · V = the divergence of V
∇ × V = the curl of V

The Attempt at a Solution



The equations above lead me to believe that it doesn't make sense to take the gradient of a vector, but the gradient operator can be combined with a dot product or cross product to give similar information about how a function behaves (divergence and curl). So perhaps divergence and curl are the vector versions of the gradient?
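As a sanity check on those two operations, here is a minimal symbolic sketch (assuming SymPy is available; the field V = (xy, yz, zx) is an arbitrary example, not the one from my assignment):

```python
# Divergence and curl of an example vector field, computed symbolically.
# V = x*y i + y*z j + z*x k is an arbitrary illustration.
from sympy.vector import CoordSys3D, divergence, curl

N = CoordSys3D('N')
V = N.x*N.y*N.i + N.y*N.z*N.j + N.z*N.x*N.k

print(divergence(V))  # a scalar: x + y + z
print(curl(V))        # a vector with components (-y, -z, -x)
```

Both match hand computation: ∇·V produces a scalar and ∇×V produces a vector, but neither one is "the gradient of V".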
 
∇V!
It's not a trick question!
 
Sorry, my comment above makes no sense; that was the gradient of a scalar, which is what you mentioned. I read too fast! I'm curious too... maybe someone more math-minded can help?
 
You can make the gradient of a vector make some sense if you know what a tensor is. If not, then you may have misinterpreted the question. What is it?
 
The specific question is "You are given some vector function V(x,y,z). Can the gradient operation be operated on V? If so, how would you interpret the result?" Very vague, I know.

I have not yet learned what a tensor is. My teacher definitely likes to give tricky questions, so I'm inclined to think this is a way to teach us that the gradient operation is used on scalars while the divergence and curl operations are used on vectors. Is this correct? The way I make sense of it in my head is with an analogy: divergence & curl : vectors :: gradient : scalars. What do you think?
 
anban said:
The specific question is "You are given some vector function V(x,y,z). Can the gradient operation be operated on V? If so, how would you interpret the result?" Very vague, I know.

I have not yet learned what a tensor is. My teacher definitely likes to give tricky questions, so I'm inclined to think this is a way to teach us that the gradient operation is used on scalars while the divergence and curl operations are used on vectors. Is this correct? The way I make sense of it in my head is with an analogy: divergence & curl : vectors :: gradient : scalars. What do you think?

Ok, so it's just a thinking question. The gradient gives you a vector from a scalar. If you want to take the gradient of a vector, you might think about taking the gradient of each component, which gives you a matrix.
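That component-by-component idea can be sketched quickly (assuming SymPy; the field is an arbitrary example, not the one from the assignment):

```python
# Stacking the gradient of each component of V as a row yields a 3x3 matrix,
# the Jacobian. V = (x*y, y*z, z*x) is an arbitrary illustration.
import sympy as sp

x, y, z = sp.symbols('x y z')
V = sp.Matrix([x*y, y*z, z*x])
J = V.jacobian([x, y, z])  # row i holds the gradient of component V_i

print(J)  # Matrix([[y, x, 0], [0, z, y], [z, 0, x]])
```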
 
anban said:

Homework Statement



I know you can find the gradient of a scalar using partial derivatives. Does it make sense to find the gradient of a vector, however?

A homework problem of mine asks to find the gradient of a vector. I'm starting to think it's a trick question...

Homework Equations



∇ · V = the divergence of V
∇ × V = the curl of V

The Attempt at a Solution



The equations above lead me to believe that it doesn't make sense to take the gradient of a vector, but the gradient operator can be combined with a dot product or cross product to give similar information about how a function behaves (divergence and curl). So perhaps divergence and curl are the vector versions of the gradient?

One way to think about the gradient is as a "linearization factor": if v(x,y,z) is a scalar function, we have
v(x+h_x, y+h_y, z+h_z) = v(x,y,z) + ⟨A, h⟩ + O(|h|²), where A is a vector, h = (h_x, h_y, h_z) is a vector, and ⟨·,·⟩ denotes the inner product. If that holds for all h, we must have A = grad v. Can you think of a similar representation when v is a vector?
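For a vector-valued v, the analogous linearization factor is the Jacobian matrix J, and the property can be checked numerically (a sketch with an arbitrary example field and test point, not values from the problem):

```python
# Numerical check of the linearization idea for a vector function:
# v(p + h) = v(p) + J h + O(|h|^2), with the Jacobian J in place of grad v.
# The field v = (x*y, y*z, z*x) and the point p are arbitrary illustrations.
import numpy as np

def v(p):
    x, y, z = p
    return np.array([x*y, y*z, z*x])

def jacobian(p):
    # Row i is the gradient of component v_i, computed by hand for this field.
    x, y, z = p
    return np.array([[y,   x,   0.0],
                     [0.0, z,   y],
                     [z,   0.0, x]])

p = np.array([1.0, 2.0, 3.0])
h = np.array([1e-4, -2e-4, 1e-4])

exact = v(p + h)
linear = v(p) + jacobian(p) @ h
err = np.max(np.abs(exact - linear))
print(err)  # tiny (order 1e-8 here), consistent with the O(|h|^2) remainder
```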
 
