Dot and Cross Products (of Gradients)

  • Thread starter: jeff1evesque
  • Tags: Cross, Dot
AI Thread Summary
The discussion centers on the identity ∇·(∇×A) = 0. Participants confirm that the divergence of a curl is always zero, because the mixed partial derivatives cancel pairwise. The confusion arises from reading ∇·∇ as a standalone quantity: the Laplacian is an operator that must act on a function and does not by itself equal zero, and since it is a scalar operator it cannot appear in a cross product such as (∇·∇)×A. Overall, the thread highlights fundamental vector calculus identities and the importance of applying these operators in the proper context.
jeff1evesque
Statement:
I was wondering if the following are identical,
\nabla \bullet \nabla \times \vec{A} = \nabla \bullet (\nabla \times \vec{A}) ? (#1)

Also, more importantly I was wondering if someone could explain to me why the following is zero for any vector,

\nabla \bullet \nabla \times \vec{A} = 0?

Reasoning:
If we look at \nabla \bullet \nabla \times \vec{A}, isn't \nabla \bullet \nabla the Laplacian, or \nabla^{2}? Is that the reason we cannot perform the operation in equation (#1) above: that the Laplacian needs a vector field to be operated on, and there is none, so \nabla \bullet \nabla = 0?

Thanks a lot,

JL
 
jeff1evesque said:
Statement:
I was wondering if the following are identical,
\nabla \bullet \nabla \times \vec{A} = \nabla \bullet (\nabla \times \vec{A}) ? (#1)

Also, more importantly I was wondering if someone could explain to me why the following is zero for any vector,

\nabla \bullet \nabla \times \vec{A} = 0?

Reasoning:
If we look at \nabla \bullet \nabla \times \vec{A}, isn't \nabla \bullet \nabla the Laplacian, or \nabla^{2}? Is that the reason we cannot perform the operation in equation (#1) above: that the Laplacian needs a vector field to be operated on, and there is none, so \nabla \bullet \nabla = 0?


Thanks a lot,


JL

Yeah, they're identical. Check out http://en.wikipedia.org/wiki/Triple_product for more on what's called the "triple product."
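
For reference, the scalar triple product can be written as a determinant, which is why any triple product with a repeated vector vanishes:

\vec{B} \cdot (\vec{B} \times \vec{A}) = \begin{vmatrix} B_x & B_y & B_z \\ B_x & B_y & B_z \\ A_x & A_y & A_z \end{vmatrix} = 0,

since a determinant with two identical rows is zero. Formally, \nabla \cdot (\nabla \times \vec{A}) has the same structure with \nabla in place of \vec{B}, though for operators this is only a mnemonic; the honest argument is the mixed-partials calculation worked out later in the thread.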
 
\nabla \bullet \nabla \times \vec{A} = 0 = div(curl(A))

I mean, if you think of it in terms of geometrical vectors: the curl measures how much of \vec{A} circulates around a point rather than flowing out of it, so the divergence of that circulation sort of intuitively seems like it should be zero.

If it's too hard to visualize, then just work it out. Assume \vec{A} is a vector function, take the curl of \vec{A}, then the divergence of that curl; by Clairaut's theorem (equality of mixed partial derivatives) you should see that it all cancels out.

Secondly, (\nabla \bullet \nabla) \times \vec{A} makes no sense: you cannot take a cross product with a scalar operator (which the Laplacian is).

Finally, the Laplacian doesn't equal zero. It's a linear operator; it doesn't equal anything by itself. In fact, it really doesn't have any meaning on its own until it's applied to a function on Euclidean space.
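
Spelling that out in Cartesian components, with \vec{A} = A_x\hat{x} + A_y\hat{y} + A_z\hat{z}:

\nabla \cdot (\nabla \times \vec{A}) = \frac{\partial}{\partial x}\left(\frac{\partial A_z}{\partial y} - \frac{\partial A_y}{\partial z}\right) + \frac{\partial}{\partial y}\left(\frac{\partial A_x}{\partial z} - \frac{\partial A_z}{\partial x}\right) + \frac{\partial}{\partial z}\left(\frac{\partial A_y}{\partial x} - \frac{\partial A_x}{\partial y}\right)

= \left(\frac{\partial^2 A_z}{\partial x\,\partial y} - \frac{\partial^2 A_z}{\partial y\,\partial x}\right) + \left(\frac{\partial^2 A_x}{\partial y\,\partial z} - \frac{\partial^2 A_x}{\partial z\,\partial y}\right) + \left(\frac{\partial^2 A_y}{\partial z\,\partial x} - \frac{\partial^2 A_y}{\partial x\,\partial z}\right) = 0,

since, by Clairaut's theorem, each pair of mixed partials is equal (assuming the components of \vec{A} have continuous second derivatives).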
 
It's more complicated than that. Work it out and you'll see the derivatives canceling.
 
Pengwuino said:
It's more complicated than that. Work it out and you'll see the derivatives canceling.

I tried; for example, I let \vec{H} = 2xy\hat{x} + x^2y^2z^2\hat{y} + x^3y^3z^3\hat{z}.

When I take the curl,
\nabla \times \vec{H} = [3x^3y^2z^3 - 2x^{2}y^{2}z]\hat{x} + [3x^{2}y^{3}z^{3} - 0]\hat{y} + [2xy^{2}z^{2} - 2x]\hat{z} and let that whole thing equal \alpha

So now when I take the divergence,
\nabla \bullet \alpha = (9x^{2}y^{2}z^{3} - 4xy^{2}z) + 9(x^{2}y^{2}z^{3}) + (4xy^{2}z) = 18(x^{2}y^{2}z^{3}) \neq 0

It seems like it almost worked out: the second term above would need a sign change. But I've reviewed the calculation, and it doesn't seem that I've made a mistake.

...makes no sense: you cannot take a cross product with a scalar operator (which the Laplacian is).
That was along the lines I was thinking: the Laplacian requires a vector to operate on, and it doesn't have one.
 
jeff1evesque said:
I tried; for example, I let \vec{H} = 2xy\hat{x} + x^2y^2z^2\hat{y} + x^3y^3z^3\hat{z}.

When I take the curl,
\nabla \times \vec{H} = [3x^3y^2z^3 - 2x^{2}y^{2}z]\hat{x} + [3x^{2}y^{3}z^{3} - 0]\hat{y} + [2xy^{2}z^{2} - 2x]\hat{z} and let that whole thing equal \alpha

I think your second term (the y-hat term) is missing a minus sign here.
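
One quick way to catch bookkeeping slips like this is to let a computer algebra system take the derivatives. A minimal sketch in Python using SymPy, with the \vec{H} from the post above (the variable names are my own):

from sympy import symbols, diff, simplify

x, y, z = symbols('x y z')

# Components of H = 2xy x-hat + x^2 y^2 z^2 y-hat + x^3 y^3 z^3 z-hat
Hx = 2*x*y
Hy = x**2 * y**2 * z**2
Hz = x**3 * y**3 * z**3

# Curl in Cartesian coordinates (note the order of terms in the y-component)
curl_x = diff(Hz, y) - diff(Hy, z)
curl_y = diff(Hx, z) - diff(Hz, x)
curl_z = diff(Hy, x) - diff(Hx, y)

# Divergence of the curl: every mixed partial appears twice with opposite signs
div_curl = diff(curl_x, x) + diff(curl_y, y) + diff(curl_z, z)
print(simplify(div_curl))  # prints 0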
 
Hi jeff1evesque! :smile:

(have a del: ∇ and use \cdot, not \bullet :wink:)
jeff1evesque said:
I was wondering if the following are identical,
\nabla \bullet \nabla \times \vec{A} = \nabla \bullet (\nabla \times \vec{A}) ? (#1)

Also, more importantly I was wondering if someone could explain to me why the following is zero for any vector,

\nabla \bullet \nabla \times \vec{A} = 0?

Reasoning:
If we look at \nabla \bullet \nabla \times \vec{A}, isn't \nabla \bullet \nabla the Laplacian, or \nabla^{2}? Is that the reason we cannot perform the operation in equation (#1) above: that the Laplacian needs a vector field to be operated on, and there is none, so \nabla \bullet \nabla = 0?

You're very confused. :redface:

Learn the following:

div curl = curl grad = 0

div(curl A) = ∇·(∇×A) = 0

curl(grad φ) = ∇×(∇φ) = 0

div(grad φ) = ∇²φ = the Laplacian.

∇·(∇×A) is zero for the same reason that B·(B×A) is zero. :wink:
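
Those identities can also be checked symbolically for completely generic fields. A minimal sketch using SymPy's vector module (phi and the A_x, A_y, A_z components are arbitrary placeholder functions of my own naming):

from sympy import Function
from sympy.vector import CoordSys3D, curl, divergence, gradient

N = CoordSys3D('N')
x, y, z = N.x, N.y, N.z

# A generic scalar field phi and a generic vector field A
phi = Function('phi')(x, y, z)
A = (Function('A_x')(x, y, z) * N.i
     + Function('A_y')(x, y, z) * N.j
     + Function('A_z')(x, y, z) * N.k)

print(divergence(curl(A)))        # div(curl A)    -> 0
print(curl(gradient(phi)))        # curl(grad phi) -> the zero vector
print(divergence(gradient(phi)))  # div(grad phi)  -> the Laplacian of phi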
 
alphysicist said:
I think your second term (the y-hat term) is missing a minus sign here.

The y-hat consists of the following,
[\frac{\partial}{\partial x}(x^{3}y^{3}z^{3}) - \frac{\partial}{\partial z}(2xy)]\hat{y} = [3x^{2}y^{3}z^{3} - 0]\hat{y}

Am I incorrect?
 
jeff1evesque said:
The y-hat consists of the following,
[\frac{\partial}{\partial x}(x^{3}y^{3}z^{3}) - \frac{\partial}{\partial z}(2xy)]\hat{y} = [3x^{2}y^{3}z^{3} - 0]\hat{y}

Am I incorrect?

That is incorrect; it is the negative of that.
 
alphysicist said:
That is incorrect; it is the negative of that.

Thanks, I totally forgot (the notation is similar to taking determinants).
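
For reference, the determinant mnemonic in question, expanded along the first row; the \hat{y} cofactor picks up a minus sign, which is exactly where the sign above comes from:

\nabla \times \vec{H} = \begin{vmatrix} \hat{x} & \hat{y} & \hat{z} \\ \frac{\partial}{\partial x} & \frac{\partial}{\partial y} & \frac{\partial}{\partial z} \\ H_x & H_y & H_z \end{vmatrix} = \left(\frac{\partial H_z}{\partial y} - \frac{\partial H_y}{\partial z}\right)\hat{x} - \left(\frac{\partial H_z}{\partial x} - \frac{\partial H_x}{\partial z}\right)\hat{y} + \left(\frac{\partial H_y}{\partial x} - \frac{\partial H_x}{\partial y}\right)\hat{z}

So the \hat{y} component is \frac{\partial H_x}{\partial z} - \frac{\partial H_z}{\partial x}, the negative of the expression quoted above.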
 