Proving the Dot Product Identity with Hint for (i) and (ii)

  • Thread starter: Shackleford
SUMMARY

The discussion focuses on proving the vector identity (i) (a × b) · (c × d) = (a · c)(b · d) − (a · d)(b · c) using the identity (ii) a × (b × c) = (a · c) b − (a · b) c. Participants explore various methods to derive (i) from (ii), emphasizing the importance of manipulating the scalar product and recognizing that vector identities hold for arbitrary vectors. The key insight is that renaming vectors allows for a straightforward derivation of (i) from (ii).

PREREQUISITES
  • Understanding of vector operations, specifically cross and dot products.
  • Familiarity with vector identities and their applications.
  • Knowledge of scalar triple products and their properties.
  • Ability to manipulate algebraic expressions involving vectors.
NEXT STEPS
  • Study the derivation of the scalar triple product and its implications in vector calculus.
  • Learn about vector identities and their proofs in linear algebra.
  • Explore applications of cross and dot products in physics and engineering.
  • Practice problems involving the manipulation of vector identities to strengthen understanding.
USEFUL FOR

Students of mathematics and physics, particularly those studying vector calculus, as well as educators looking for examples of vector identity proofs.

Shackleford
(i) (a × b) · (c × d) = (a · c)(b · d) − (a · d)(b · c).
(ii) a × (b × c) = (a · c) b − (a · b) c.

(a) Show that (i) can be derived directly from (ii). Hint: Dot (ii) with d and use the
fact that (a × b) · c = (c × a) · b, etc.

I worked on this every way I knew how last night, but I couldn't get it to work. I'm pretty sure it's something simple that I'm missing.

I've set u = (a x b), v = (c x d), etc.
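As an editorial aside (not part of the original thread), here is a quick numerical spot-check of (i) and (ii) with random vectors; a minimal sketch, assuming NumPy is available and using illustrative variable names:

import numpy as np

# Editorial aside: numerically spot-check identities (i) and (ii) with random 3-vectors.
rng = np.random.default_rng(0)
a, b, c, d = rng.standard_normal((4, 3))

# (i) (a × b) · (c × d) = (a · c)(b · d) − (a · d)(b · c)
lhs_i = np.dot(np.cross(a, b), np.cross(c, d))
rhs_i = np.dot(a, c) * np.dot(b, d) - np.dot(a, d) * np.dot(b, c)
print(np.isclose(lhs_i, rhs_i))   # expected: True

# (ii) a × (b × c) = (a · c) b − (a · b) c
lhs_ii = np.cross(a, np.cross(b, c))
rhs_ii = np.dot(a, c) * b - np.dot(a, b) * c
print(np.allclose(lhs_ii, rhs_ii))  # expected: True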
 
Did you try the hint? Where did that lead you?


(where did u come from? :confused:)
 
Hurkyl said:
Did you try the hint? Where did that lead you?


(where did u come from? :confused:)

Yeah, I tried the hint. I substituted u to put it in a triple scalar form. As far as I know, you can only manipulate three distinct vectors in the triple scalar. The (b x c) is a particular vector. That's why I substituted u in.
 
It might be easiest for you if your first step is to use (ii) to calculate (a × b) × c.
 
gabbagabbahey said:
It might be easiest for you if your first step is to use (ii) to calculate (a × b) × c.

Yeah, (a x b) x c = - c x (a x b). Dotting that identity with d explicitly gives you (i).

I tried to get (i) directly from (ii) by dotting a x (b x c) with d and manipulating the scalar product form. However, it seems that you're supposed to merely use the form of (ii) and its known identity to derive (i), not necessarily stick with the given (ii). Is that correct? Is it even possible to derive (i) the way I was going at it?
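
As an aside, a sketch of how that (a × b) × c route plays out, using only (ii) and the cyclic property of the scalar triple product:

(a × b) × c = −c × (a × b) = −[(c · b) a − (c · a) b] = (a · c) b − (b · c) a

(a × b) · (c × d) = [(a × b) × c] · d = (a · c)(b · d) − (a · d)(b · c),

which is exactly (i).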
 
Shackleford said:
Yeah, I tried the hint. I substituted u to put it in a triple scalar form.
Well, what did you get? And did you try the other part of the hint, to rotate the factors of the triple?
 
Hurkyl said:
Well, what did you get? And did you try the other part of the hint, to rotate the factors of the triple?

I didn't get (i).

Yes. I rotated until I got back to where I started. I'm at work right now, and all my scratch work is at the house. I tried all variations of the triple scalar product.
 
(a x (b x c)) . d = (d x a) . (b x c)

(d x a) . (b x c) = c . ((d x a) x b) = b . (c x (d x a))

etc.
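
For reference, a sketch of how the first rotation above combines with the hint: dotting (ii) with d gives

[a × (b × c)] · d = (a · c)(b · d) − (a · b)(c · d),

and since [a × (b × c)] · d = (d × a) · (b × c), this is already a statement of the form (i), just with the four vectors named d, a, b, c.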
 
Shackleford said:
Is it even possible to derive (i) the way I was going at it?

You probably did derive (i) with your first attempt, you just didn't realize it.

The identity (d × a) · (b × c) = (a · c)(b · d) − (a · b)(c · d) is exactly the same thing as the identity (a × b) · (c × d) = (a · c)(b · d) − (a · d)(b · c); you simply have to rename your vectors a → b, b → c, c → d, and d → a.

Remember, vector identities are true for arbitrary vectors, so it doesn't matter how you name them.
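
Spelling out that renaming: starting from the derived identity

(d × a) · (b × c) = (a · c)(b · d) − (a · b)(c · d)

and replacing a → b, b → c, c → d, d → a throughout gives

(a × b) · (c × d) = (b · d)(c · a) − (b · c)(d · a) = (a · c)(b · d) − (a · d)(b · c),

which is exactly (i).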
 
  • #10
gabbagabbahey said:
You probably did derive (i) with your first attempt, you just didn't realize it.

The identity (d × a) · (b × c) = (a · c)(b · d) − (a · b)(c · d) is exactly the same thing as the identity (a × b) · (c × d) = (a · c)(b · d) − (a · d)(b · c); you simply have to rename your vectors a → b, b → c, c → d, and d → a.

Remember, vector identities are true for arbitrary vectors, so it doesn't matter how you name them.

And I was ready to throw in the towel, quit my major and minor and get a liberal arts degree. Well, that's just great. I knew it had to be something silly I was overlooking. The thought of renaming the vectors DID occur to me, but I didn't follow through with it. I spent an embarrassing amount of time on that problem manipulating the scalar product (ii) as written trying to get (i) as written. So, all I had to do was get the form of (i) FROM (ii) and change the vectors in the (ii) derivation to match the vectors in (i).
 
  • #11
Well, I won't take it back yet. I really suck at this.

6. Recall the identities already established.
(i) (a × b) · (c × d) = (a · c)(b · d) − (a · d)(b · c).
(ii) a × (b × c) = (a · c) b − (a · b) c.

(a) Show that (i) can be derived directly from (ii). Hint: Dot (ii) with d and use the
fact that (a × b) · c = (c × a) · b, etc. (b) Also show that (ii) can be derived directly
from (i). Hint: When (b × c) ≠ 0 (otherwise the result is obvious) it is easily seen that
a × (b × c) = αb + βc for some scalars α and β. Now, dot this with d, use identity (i),
and finally use the fact that the vector d is arbitrary. (Either part of this exercise would
make a pretty good exam question.)

a × (b × c) = αb + βc

(αb + βc) · d = (αb · d) + (βc · d)
 
  • #12
Shackleford said:
a × (b × c) = αb + βc

(αb + βc) · d = (αb · d) + (βc · d)

Okay, so combining these two lines you have [a × (b × c)] · d = α(b · d) + β(c · d); now use the scalar triple product rule on the LHS to get something you can compare to (i).
 
  • #13
gabbagabbahey said:
Okay, so combining these two lines you have [a × (b × c)] · d = α(b · d) + β(c · d); now use the scalar triple product rule on the LHS to get something you can compare to (i).

I would imagine on the LHS I do what I did earlier??

(d x a) · (b x c) = α(b · d) + β(c · d).

If I do that, then I guess I could do

d → a, a → b, b → c, c → d

then α = (a · c), β = (b · c). IDK. I'm getting tired.
 
Last edited:
  • #14
Shackleford said:
I would imagine on the LHS I do what I did earlier??

(d x a) · (b x c) = α(b · d) + β(c · d)

Right, and what does (i) tell you the LHS of this equation must be?
 
  • #15
gabbagabbahey said:
Right, and what does (i) tell you the LHS of this equation must be?

Did I get it right in the edit to my previous post?
 
  • #16
Shackleford said:
Did I get it right in the edit to my previous post?

Close, but not quite. Try this one step at a time... what do you get when you apply (i) to (d × a) · (b × c)?
 
  • #17
gabbagabbahey said:
Close, but not quite. Try this one step at a time... what do you get when you apply (i) to (d × a) · (b × c)?

Well, correspondingly,

(d x a) · (b x c) = (d · b)(a · c) - (d · c)(a · b) = α(b · d) + β(c · d)

(d · b)(a · c) - (d · c)(a · b) = α(b · d) + β(c · d)

α = (a · c)

β = -(a · b)
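
To close the loop on part (b), the final step the hint points at (a short completion, not stated explicitly in the thread): the equality

α(b · d) + β(c · d) = (a · c)(b · d) − (a · b)(c · d)

holds for every d, so the vector (α − a · c) b + (β + a · b) c is orthogonal to every d and hence zero. Since b × c ≠ 0, b and c are linearly independent, which forces α = (a · c) and β = −(a · b). Substituting back into a × (b × c) = αb + βc gives

a × (b × c) = (a · c) b − (a · b) c,

which is identity (ii).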
 
