What are some useful vector calculus identities for vector fields A and B?


Homework Help Overview

The discussion revolves around proving a vector calculus identity for arbitrary vector fields A and B, specifically the expression ∇·(A ∧ B) = B·(∇ ∧ A) − A·(∇ ∧ B). The subject area is vector calculus, focusing on identities related to vector fields and their operations.

Discussion Character

  • Exploratory, Mathematical reasoning, Assumption checking

Approaches and Questions Raised

  • Participants discuss various methods to approach the proof, including expanding both sides of the identity and considering specific components. Some express challenges with suffix notation and seek alternative methods. Others mention the use of the Levi-Civita symbol and the Leibniz rule in their reasoning.

Discussion Status

The conversation is active, with participants sharing their attempts and insights. Some have successfully completed parts of the identity while others are clarifying concepts and addressing confusion about summation indices and differentiation. There is a collaborative effort to guide each other through the reasoning process without reaching a final consensus.

Contextual Notes

Participants note constraints such as the requirement to avoid suffix notation and the complexity of the calculations involved. There is also mention of the need to keep track of indices during the derivation process.

hhhmortal

Homework Statement



For arbitrary vector fields A and B show that:


∇·(A ∧ B) = B·(∇ ∧ A) − A·(∇ ∧ B)





The Attempt at a Solution



I considered only the 'i'-axis, by saying that it is perpendicular to A and B, and then I expanded both the left and right sides out. The working is too much to post here; I didn't manage to prove it. I was hoping someone would know of a useful webpage where I can find out more about vector calculus identities.

Thanks!
 
You can use the fact that

[tex][\mathbf A \times \mathbf B]_i = \epsilon_{ijk} A_j B_k[/tex]

where there is a summation over repeated indices and [tex]\epsilon_{ijk}[/tex] is completely antisymmetric under an exchange of indices (the Levi-Civita symbol). That way it's not difficult to prove. Don't know about websites though.
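As a quick sanity check (separate from the proof itself), the identity can be verified symbolically with SymPy. The component names A0..A2, B0..B2 below are just placeholders for arbitrary smooth functions of x, y, z:

```python
import sympy as sp

x, y, z = sp.symbols('x y z')
coords = (x, y, z)

# Placeholder components of two arbitrary smooth vector fields A and B
A = [sp.Function(f'A{i}')(x, y, z) for i in range(3)]
B = [sp.Function(f'B{i}')(x, y, z) for i in range(3)]

def cross(F, G):
    # Component form of F x G
    return [F[1]*G[2] - F[2]*G[1],
            F[2]*G[0] - F[0]*G[2],
            F[0]*G[1] - F[1]*G[0]]

def curl(F):
    # Component form of nabla x F
    return [sp.diff(F[2], y) - sp.diff(F[1], z),
            sp.diff(F[0], z) - sp.diff(F[2], x),
            sp.diff(F[1], x) - sp.diff(F[0], y)]

def div(F):
    return sum(sp.diff(F[i], coords[i]) for i in range(3))

def dot(F, G):
    return sum(F[i]*G[i] for i in range(3))

lhs = div(cross(A, B))
rhs = dot(B, curl(A)) - dot(A, curl(B))
print(sp.simplify(lhs - rhs))  # prints 0, confirming the identity
```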
 
Yeah, although I'm being asked to prove it without using suffix notation, which is much more tedious.
 
It is, but it's still a straightforward calculation, just remember the Leibniz rule and keep track of the indices.
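For reference, written out in full Cartesian components (no suffix notation) that calculation runs as follows. Starting from the definition of the cross product,

[tex]\nabla \cdot (\mathbf A \times \mathbf B) = \frac{\partial}{\partial x}(A_y B_z - A_z B_y) + \frac{\partial}{\partial y}(A_z B_x - A_x B_z) + \frac{\partial}{\partial z}(A_x B_y - A_y B_x)[/tex]

Applying the product rule gives twelve terms. The six containing derivatives of A group as

[tex]B_x\left(\frac{\partial A_z}{\partial y} - \frac{\partial A_y}{\partial z}\right) + B_y\left(\frac{\partial A_x}{\partial z} - \frac{\partial A_z}{\partial x}\right) + B_z\left(\frac{\partial A_y}{\partial x} - \frac{\partial A_x}{\partial y}\right) = \mathbf B \cdot (\nabla \times \mathbf B \to \mathbf A)[/tex]

i.e. [tex]\mathbf B \cdot (\nabla \times \mathbf A)[/tex], and the remaining six group the same way into [tex]-\mathbf A \cdot (\nabla \times \mathbf B)[/tex].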
 
OK, I completed the identity without using suffix notation, but now I will try to prove it using suffix notation:

My working out is the following:

I take the ith component of both sides. I first start with the left side:

[tex]\epsilon_{ijk} \frac{\partial}{\partial x_i}(A_j B_k)[/tex]

And then for the right side:

[tex]B_i \epsilon_{ijk} \frac{\partial A_k}{\partial x_j} - A_i \epsilon_{ijk} \frac{\partial B_k}{\partial x_j}[/tex]

As you can see, the differentials are different: for the left side it is w.r.t. the 'i' component, while for the right side it is w.r.t. the 'j' component.
 
There are no components on either side; the expression is a scalar, since it's a dot product. There is a summation over i, j and k, so the left hand side is

[tex]\nabla \cdot (\mathbf A \times \mathbf B) = \sum_{i=1}^3 \partial_i (\mathbf A \times \mathbf B)_i = \sum_{i=1}^3 \partial_i (\sum_{j,k=1}^3\epsilon_{ijk} A_j B_k) = \sum_{i,j,k=1}^3 \epsilon_{ijk} \partial_i (A_j B_k)[/tex]

Calculate the derivative and rearrange the terms to get the expression on the right hand side (Hint: use the fact that epsilon is antisymmetric under an exchange of indices).
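Carrying out that hint: the product rule splits the sum into two pieces,

[tex]\sum_{i,j,k=1}^3 \epsilon_{ijk}\, \partial_i (A_j B_k) = \sum_{i,j,k=1}^3 \epsilon_{ijk} (\partial_i A_j) B_k + \sum_{i,j,k=1}^3 \epsilon_{ijk} A_j (\partial_i B_k)[/tex]

In the first piece [tex]\epsilon_{ijk} = \epsilon_{kij}[/tex] (a cyclic permutation), so it equals [tex]\sum_k B_k (\nabla \times \mathbf A)_k = \mathbf B \cdot (\nabla \times \mathbf A)[/tex]. In the second piece [tex]\epsilon_{ijk} = -\epsilon_{jik}[/tex], so it equals [tex]-\sum_j A_j (\nabla \times \mathbf B)_j = -\mathbf A \cdot (\nabla \times \mathbf B)[/tex].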
 
phsopher said:
There are no components on either side, the expression is a scalar since it's a dot product.

Oh OK! Why is k = 1 though? I am a bit confused. So you have summed the left side over i; will this mean that (A_j)(B_k) will be differentiated w.r.t. x_i, and then I can use the product rule?
 
No, the summation is over all three indices; it has to be, because the end result is a scalar, so there can't be any indices remaining. What the above means is that i goes from 1 to 3, j goes from 1 to 3 and k goes from 1 to 3 (the "k=1" under the summation sign is just the lower limit of the sum).

In any case, yes, differentiate w.r.t. x_i inside the sum and you will get two terms. Rearrange them using the antisymmetry of epsilon and you will get the two terms of the right hand side.
 
