Gradient of a Vector: Scalar or Vector?

Summary
The gradient of a vector is neither a scalar nor a vector in the traditional sense, since the gradient is usually defined for scalar functions. The gradient of a vector field can, however, be expressed as a rank-2 tensor with components ∂A^i/∂x^j. The divergence of a vector field is a scalar, while the curl is a vector. The discussion clarifies that the "del" operator can be applied to vector functions in several ways, each yielding a different mathematical object. Understanding these distinctions is crucial for applying them correctly in vector calculus.
hoomanya
Hi,
Just a simple, quick question:
Does the gradient of a vector give a scalar or a vector?
Thanks!
 
hi hoomanya! :smile:
hoomanya said:
Does the gradient of a vector give a scalar or a vector?

there's no gradient (https://www.physicsforums.com/library.php?do=view_item&itemid=11) of a vector

gradients are of scalars

for a scalar f, ∇f is the gradient of f

for a vector V, ∇V has no meaning (but ∇·V is the divergence, and ∇×V is the curl) :wink:
 
Thank you very much for your quick reply. :)
 
You can, of course, have
\nabla\cdot \vec{\phi}(x, y, z)= \frac{\partial f}{\partial x}+ \frac{\partial g}{\partial y}+ \frac{\partial h}{\partial z}
the "divergence" of the vector valued function \vec{\phi}(x, y, z)= f\vec{i}+ g\vec{j}+ h\vec{k}, which is a scalar, or
\nabla\times \vec{\phi}= \left(\frac{\partial h}{\partial y}- \frac{\partial g}{\partial z}\right)\vec{i}+ \left(\frac{\partial f}{\partial z}- \frac{\partial h}{\partial x}\right)\vec{j}+ \left(\frac{\partial g}{\partial x}- \frac{\partial f}{\partial y}\right)\vec{k}
the "curl" of the vector valued function, \vec{\phi}(x, y, z), which is a vector.

Perhaps that is what you are thinking of. There are three kinds of vector "multiplication" and so three ways we can attach the "del" operator to a function.
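
Not part of the original thread, but a quick way to check the divergence and curl formulas above is with SymPy's vector module; the component choices f = xy, g = yz, h = zx below are arbitrary and purely illustrative.

```python
# Minimal sketch (assumed example, not from the thread): the divergence and
# curl of phi = f i + g j + h k, computed symbolically.
from sympy.vector import CoordSys3D, divergence, curl

N = CoordSys3D('N')          # Cartesian frame with coordinates N.x, N.y, N.z
x, y, z = N.x, N.y, N.z

f, g, h = x*y, y*z, z*x      # arbitrary example components
phi = f*N.i + g*N.j + h*N.k

print(divergence(phi))       # f_x + g_y + h_z  (a scalar): x + y + z
print(curl(phi))             # (h_y - g_z) i + (f_z - h_x) j + (g_x - f_y) k  (a vector)
```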
 
I'd say there's a perfectly good definition for the gradient of a vector: it's a rank-2 Cartesian tensor.

For the vector \vec{A} = A_x \hat{i} + A_y \hat{j} + A_z \hat{k}

we have the beast \nabla \vec{A} = \nabla A_x \hat{i} + \nabla A_y \hat{j} + \nabla A_z \hat{k}

such that for any vector \vec{v}
\vec{v} \cdot \nabla \vec{A} = (\vec{v} \cdot \nabla A_x) \hat{i} + (\vec{v} \cdot \nabla A_y) \hat{j} + (\vec{v} \cdot \nabla A_z) \hat{k}

This is clearer in component notation:
\vec{A} \rightarrow A^i
then
\nabla \vec{A} \rightarrow (\nabla A)^i_j = \frac{\partial A^i}{\partial x^j},

where the product with the vector \vec{v} above is really
contraction on the second index.
\vec{v} \cdot \nabla \vec{A} \rightarrow \sum_j v^j \frac{\partial A^i}{\partial x^j}
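
(An aside not from the thread: in matrix language (\nabla A)^i_j = \partial A^i/\partial x^j is just the Jacobian matrix of the components of \vec{A}, so a small SymPy sketch with an arbitrary example field A = (xy, yz, zx) makes the contraction above concrete.)

```python
# Minimal sketch (assumed example, not from the thread): the "gradient of a
# vector" as the rank-2 object dA^i/dx^j, i.e. the Jacobian matrix.
import sympy as sp

x, y, z = sp.symbols('x y z')
A = sp.Matrix([x*y, y*z, z*x])     # components A^i of an arbitrary vector field

grad_A = A.jacobian([x, y, z])     # entry (i, j) is dA^i/dx^j
print(grad_A)

# Contracting on the second index gives (v . nabla) A:
v = sp.Matrix([1, 2, 3])           # an arbitrary constant vector v^j
print(grad_A * v)                  # sum_j v^j dA^i/dx^j, a vector again
```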
 
