Are Vectors Parallel If Dot Product Equals Product of Norms?

SUMMARY

The discussion centers on the relationship between the dot product and the norms of vectors in linear algebra, specifically whether the equality |u · v| = ||u|| ||v|| implies that vectors u and v are parallel. Participants confirm that parallel vectors attain this equality (the equality case of the Cauchy-Schwarz inequality) and explore methods to prove the converse. They suggest using projections and the properties of inner products to establish a bidirectional implication, ultimately concluding that if the equality holds, the vectors must indeed be parallel.

PREREQUISITES
  • Understanding of vector norms and dot products
  • Familiarity with the Cauchy-Schwarz inequality
  • Basic knowledge of linear dependence and independence
  • Concept of vector projections in linear algebra
NEXT STEPS
  • Study the proof of the Cauchy-Schwarz inequality in detail
  • Learn about vector projections and their applications
  • Explore linear dependence and independence in vector spaces
  • Investigate the implications of the dot product in higher-dimensional spaces
USEFUL FOR

Students and educators in linear algebra, mathematicians interested in vector analysis, and anyone looking to deepen their understanding of the properties of vectors and their relationships.

Bipolarity
Hey guys I am a beginner in linear algebra. I am doing vectors now and I just noticed that when two vectors are parallel (or antiparallel), the product of their norms is equal to the absolute value of their dot product, or

|u \cdot v| = ||u|| \, ||v||

I know that this is a special case of the Cauchy-Schwarz inequality. My question is, is the converse necessarily true? In other words, if you know the above equation to be true for a pair of vectors u and v, must they necessarily be parallel? How might one go about proving this? Could we assume the contrary and show an inconsistency?

By the way, by parallel, I mean to say that two vectors are parallel if there exists a scalar (real number, not necessarily positive) that scales one vector onto the other.

BiP
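For a quick numerical illustration of the observation (with an arbitrarily chosen antiparallel pair), take u = (1, 2) and v = (-2, -4) = -2u. Then

|u \cdot v| = |(1)(-2) + (2)(-4)| = 10 \quad \text{and} \quad ||u|| \, ||v|| = \sqrt{5} \cdot 2\sqrt{5} = 10,

so the equality holds, as expected for parallel (here antiparallel) vectors.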
 
Hey Bipolarity.

You can prove this if you want by letting one vector be a scalar multiple of the other (say v = c*u, where c is a non-zero real number) and then using the properties of norms and inner products to show that the equality holds.

For inner products we know that <u,cu> = c<u,u>, and for norms we know that ||cu|| = |c|*||u||; the rest follows from there.
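For concreteness, here is a minimal sketch of that forward direction, assuming v = cu for a real scalar c and using only the properties listed above:

|u \cdot v| = |u \cdot (cu)| = |c| \, |u \cdot u| = |c| \, ||u||^2 = ||u|| \, \big( |c| \, ||u|| \big) = ||u|| \, ||cu|| = ||u|| \, ||v||.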
 
chiro said:
Hey Bipolarity.

You can prove this if you want by letting one vector be a scalar multiple of the other (say v = c*u, where c is a non-zero real number) and then using the properties of norms and inner products to show that the equality holds.

For inner products we know that <u,cu> = c<u,u>, and for norms we know that ||cu|| = |c|*||u||; the rest follows from there.

Hey chiro, I am afraid you might have misunderstood my question. I already proved that parallel vectors satisfy the above equation. What I am trying to prove is whether the satisfaction of the above equation necessitates that two vectors be parallel.

Why would you assume the vectors are parallel if you are trying to prove exactly that?
Also, I have not studied inner product spaces yet. Do you mean the dot product when you say inner product?

BiP
 
Well, as you are aware, Cauchy-Schwarz is an inequality, which means that a pair of vectors either attains equality or lies strictly within the bound; so yes, you should be able to show that if the equality is satisfied then the vectors are parallel.

In terms of an actual proof, if you show a bidirectional implication between the two statements, then negating either one means the other cannot hold either.
 
chiro said:
Well, as you are aware, Cauchy-Schwarz is an inequality, which means that a pair of vectors either attains equality or lies strictly within the bound; so yes, you should be able to show that if the equality is satisfied then the vectors are parallel.

In terms of an actual proof, if you show a bidirectional implication between the two statements, then negating either one means the other cannot hold either.

Yes that is what I am asking. How would I go about proving that the vectors are parallel? I would have to prove the existence of a scalar that scales one vector onto the other. What would that scalar be, given that the equation above is satisfied? Somehow I would have to define this scalar in terms of the dot product and the norms of the vectors. I feel that is the only way to prove this conjecture.

BiP
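A natural candidate for that scalar, assuming v \neq 0, is the projection coefficient

c = \frac{u \cdot v}{||v||^2},

chosen so that (u - cv) \cdot v = 0. If the equality |u \cdot v| = ||u|| \, ||v|| forces u - cv = 0, then u = cv and the vectors are parallel; the replies below sketch two ways to show exactly that.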
 
Do you know the proof of the Cauchy-Schwarz inequality?? Can you write it down?
What you want to prove usually follows from a small modification in the proof.
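As a sketch of the modification being hinted at, assuming v \neq 0: the standard proof considers, for every real t,

0 \le ||u - tv||^2 = ||u||^2 - 2t \, (u \cdot v) + t^2 \, ||v||^2.

Since this quadratic in t is nonnegative, its discriminant satisfies 4(u \cdot v)^2 - 4 \, ||u||^2 \, ||v||^2 \le 0, which is the Cauchy-Schwarz inequality. If |u \cdot v| = ||u|| \, ||v||, the discriminant is zero, so the quadratic has a (double) real root t_0 = (u \cdot v)/||v||^2; then ||u - t_0 v||^2 = 0, i.e. u = t_0 v, and the vectors are parallel.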
 
I've thought about this, and I think one way is to use projections to re-formulate the problem and then show a bi-directional proof based on the difference between the result of projecting one vector onto another and the original vector itself.

So if you want to project u onto v, you calculate proj(u,v) = (<u,v>/||v||) * (v/||v||) = (<u,v>/||v||^2)*v.

Now let the difference of the norms be x = ||u|| - ||proj(u,v)||. If x > 0 then these vectors are not parallel (i.e. not linearly dependent).

Now we need to show what linear dependence (or the lack of it) implies about the equality. To do this, break a general vector u into a scalar multiple of v plus a residual term, i.e. u = a*v + w, where w is the left-over component.

Using distributivity of the inner product we get proj(u,v) = (<u,v>/||v||^2)*v = (<a*v + w,v>/||v||^2)*v = (<a*v,v>/||v||^2)*v + (<w,v>/||v||^2)*v = av + proj(w,v).

So all I have done is extend the above to take into account a vector with a linearly dependent term and a linearly independent term.

You will have to rearrange that yourself (I'm just sketching the skeleton of the idea), but if you show that you can only get the equality with w = 0, then getting the equality means that w = 0. Suppose w ≠ 0 and derive a contradiction to show bidirectionality.

The projection approach might be a good starting point, since it shows explicitly how <u,v> relates to the dependent and independent parts, and you can relate these ideas to Cauchy-Schwarz as well.
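A brief sketch of how that outline can be completed, assuming v \neq 0 and taking a = (u \cdot v)/||v||^2 so that w = u - av satisfies w \cdot v = 0: then

u \cdot v = a \, ||v||^2 \quad \text{and} \quad ||u||^2 = a^2 \, ||v||^2 + ||w||^2,

so

|u \cdot v|^2 = a^2 \, ||v||^4 \le \big( a^2 \, ||v||^2 + ||w||^2 \big) ||v||^2 = ||u||^2 \, ||v||^2,

with equality exactly when ||w||^2 \, ||v||^2 = 0, i.e. when w = 0 and u = av.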
 
micromass said:
Do you know the proof of the Cauchy-Schwarz inequality?? Can you write it down?
What you want to prove usually follows from a small modification in the proof.

Thanks micromass! I remember my professor showing us the proof, and I remember having a notion of solving this problem using that proof. Now I know that it can be done.

I will try to apply the Cauchy-Schwarz inequality; however, the proof is a bit too lengthy for me to type up in LaTeX here.

Thanks for the help!

BiP
 
One can show that the dot product of two vectors, u and v, is u \cdot v = ||u|| \, ||v|| \cos(\theta), where \theta is the angle between the two vectors. (One can show that in two or three dimensions; in higher-dimensional vector spaces, that relation is typically used as the definition of "angle".) So if |u \cdot v| = ||u|| \, ||v||, then \cos(\theta) must equal 1 or -1, which in turn means that the angle between u and v is either 0 or 180 degrees.
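For contrast, a pair that is not parallel, say u = (1, 2) and w = (2, 1) (chosen arbitrarily), gives

\cos(\theta) = \frac{u \cdot w}{||u|| \, ||w||} = \frac{4}{5} < 1,

so the inequality is strict, matching the angle characterization above.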
 
