
Dot product = Product of norms

  1. Oct 9, 2012 #1
    Hey guys I am a beginner in linear algebra. I am doing vectors now and I just noticed that when two vectors are parallel (or antiparallel), the product of their norms is equal to the absolute value of their dot product, or

    [tex] |u \cdot v | = ||u|| \ ||v|| [/tex]

    I know that this is a special case of the Cauchy-Schwarz inequality. My question is, is the converse necessarily true? In other words, if you know the above equation to be true for a pair of vectors u and v, must they necessarily be parallel? How might one go about proving this? Could we assume the contrary and show an inconsistency?

    By the way, by parallel, I mean to say that two vectors are parallel if there exists a scalar (real number, not necessarily positive) that scales one vector onto the other.
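    A quick numerical illustration of the observation (not a proof; the example vectors here are my own):

    ```python
    import math

    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    def norm(a):
        return math.sqrt(dot(a, a))

    u = [3.0, -1.0, 2.0]
    v_par = [-1.5 * x for x in u]     # antiparallel: v_par = -1.5 * u
    v_other = [1.0, 4.0, 0.0]         # not a scalar multiple of u

    # Equality for the (anti)parallel pair:
    print(math.isclose(abs(dot(u, v_par)), norm(u) * norm(v_par)))   # True

    # Strict inequality for the non-parallel pair:
    print(abs(dot(u, v_other)) < norm(u) * norm(v_other))            # True
    ```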

  3. Oct 9, 2012 #2


    Science Advisor

    Hey Bipolarity.

    You can prove this if you want by letting one vector be a scalar multiple of the other (like v = c*u where c is a non-zero real number) and then using the properties of norms and inner products to show that the above is the case.

    For inner products we know that <u,cu> = c<u,u> and for norms we know that ||cu|| = |c|*||u|| and the rest follows from there on.
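    Spelling out that chain of equalities for the forward direction (assuming v = c*u with c a non-zero real number):

    [tex] |u \cdot v| = |u \cdot (cu)| = |c| \ |u \cdot u| = |c| \ ||u||^2 = ||u|| \ \left( |c| \ ||u|| \right) = ||u|| \ ||cu|| = ||u|| \ ||v|| [/tex]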
  4. Oct 9, 2012 #3
    Hey chiro, I am afraid you might have misunderstood my question. I already proved that parallel vectors satisfy the above equation. What I am trying to prove is whether the satisfaction of the above equation necessitates that two vectors be parallel.

    Why would you assume the vectors are parallel if you are trying to prove exactly that?
    Also, I have not studied inner product spaces yet. Do you mean the dot product when you say inner product?

  5. Oct 9, 2012 #4



    Well, as you are aware, Cauchy-Schwarz is an inequality, which means that if a pair of vectors does not attain the extreme of the inequality, it must lie strictly within its bounds. So yes, you should be able to show that if the relationship is satisfied then the vectors are parallel.

    In terms of an actual "proof": if you show that the two statements imply each other (a biconditional), then by contraposition, if one statement fails to hold, the other cannot hold either.
  6. Oct 9, 2012 #5
    Yes that is what I am asking. How would I go about proving that the vectors are parallel? I would have to prove the existence of a scalar that scales one vector onto the other. What would that scalar be, given that the equation above is satisfied? Somehow I would have to define this scalar in terms of the dot product and the norms of the vectors. I feel that is the only way to prove this conjecture.

  7. Oct 9, 2012 #6
    Do you know the proof of the Cauchy-Schwarz inequality?? Can you write it down?
    What you want to prove usually follows from a small modification in the proof.
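    For reference, here is a sketch of the standard proof and where the equality case drops out (assuming v ≠ 0; the case v = 0 is trivial). For any real t,

    [tex] p(t) = ||u - tv||^2 = ||u||^2 - 2t (u \cdot v) + t^2 ||v||^2 \geq 0 [/tex]

    Since this quadratic in t is nonnegative, its discriminant satisfies

    [tex] 4(u \cdot v)^2 - 4 ||u||^2 \ ||v||^2 \leq 0 [/tex]

    which is exactly Cauchy-Schwarz. Now if [tex] |u \cdot v| = ||u|| \ ||v|| [/tex] holds, the discriminant is zero, so p has a real root [itex]t_0 = (u \cdot v)/||v||^2[/itex], and [itex]p(t_0) = ||u - t_0 v||^2 = 0[/itex] forces [itex]u = t_0 v[/itex]. So the vectors are parallel, and the scalar is given explicitly in terms of the dot product and the norms, which is what was asked for in post #5.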
  8. Oct 9, 2012 #7



    I've thought about this, and I think one way is to use projections to re-formulate the problem and then show a bi-directional proof based on the difference between the result of projecting one vector onto another and the original vector itself.

    So if you want to project u onto v, you calculate proj(u,v) = (<u,v>/||v||) * (v/||v||) = (<u,v>/||v||^2) * v.

    Now let x = ||u|| - ||proj(u,v)|| be the difference of the norms of those vectors. If x > 0 then these vectors are not parallel (i.e. not linearly dependent).

    Now if something is not linearly dependent with something, we need to show that this implies the above statement. So to do this, you should break up a general vector u in terms of a scalar multiple times v plus some residual term. (i.e. u = a*v + w where w is the left-over component).

    Using distributivity of the inner product we get proj(u,v) = (<u,v>/||v||^2)*v = (<a*v + w,v>/||v||^2)*v = (<a*v,v>/||v||^2)*v + (<w,v>/||v||^2)*v = av + proj(w,v).

    So all I have done is extended the above to take into account a vector with a linearly dependent term and a linearly independent term.

    You will have to re-arrange that stuff yourself (I'm just fleshing out the skeleton of the idea), but if you show that linear dependence means you can only get the equality with w = 0, then getting that equality means that w = 0. Suppose w ≠ 0 and derive a contradiction to establish the other direction.

    The projection part might be a good catalyst to consider, since it shows explicitly how <u,v> is related to the dependent and independent parts, and you can relate these ideas to Cauchy-Schwarz as well.
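    As a numerical sanity check of that decomposition (example values are my own, with w chosen orthogonal to v):

    ```python
    import math

    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    def norm(a):
        return math.sqrt(dot(a, a))

    v = [2.0, 1.0, -1.0]
    a = 3.0
    w = [1.0, -1.0, 1.0]                        # chosen so that <w, v> = 2 - 1 - 1 = 0
    u = [a * vi + wi for vi, wi in zip(v, w)]   # u = a*v + w

    # <u, v> only sees the dependent part: <u, v> = a * ||v||^2
    print(math.isclose(dot(u, v), a * dot(v, v)))    # True

    # Because w != 0, the Cauchy-Schwarz equality fails (strict inequality):
    print(abs(dot(u, v)) < norm(u) * norm(v))        # True
    ```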
  9. Oct 10, 2012 #8
    Thanks micromass! I remember my professor showing us the proof, and I remember having a notion of solving this problem using that proof. Now I know that it can be done.

    I will try to apply the Cauchy-Schwarz inequality, though the proof is too lengthy for me to type up in LaTeX here.

    Thanks for the help!

  10. Oct 10, 2012 #9



    One can show that the dot product of two vectors, u and v, is [itex]u\cdot v= ||u|| \ ||v|| \cos(\theta)[/itex] where [itex]\theta[/itex] is the angle between the two vectors. (One can prove that in two or three dimensions; in higher-dimensional vector spaces, that formula is typically used as the definition of "angle".) So if [itex]|u\cdot v|= ||u|| \ ||v||[/itex], then we must have [itex]\cos(\theta)[/itex] equal to 1 or -1, which in turn means that the angle between u and v is either 0 or 180 degrees.
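    A small numerical illustration of the angle formula (the example vectors are my own; the clamp guards against floating-point values just outside [-1, 1]):

    ```python
    import math

    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    def norm(a):
        return math.sqrt(dot(a, a))

    u = [3.0, -1.0, 2.0]
    v = [-6.0, 2.0, -4.0]         # v = -2 * u, so the vectors are antiparallel

    cos_theta = dot(u, v) / (norm(u) * norm(v))
    theta = math.degrees(math.acos(max(-1.0, min(1.0, cos_theta))))

    print(math.isclose(cos_theta, -1.0))              # True
    print(math.isclose(theta, 180.0, abs_tol=1e-4))   # True: angle is 180 degrees
    ```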