Cauchy-Schwarz inequality in Rudin

  1. Mar 30, 2013 #1
    I have worked my way through the proof of the Cauchy-Schwarz inequality in Rudin, but I am struggling to understand how one could have arrived at that proof in the first place. The essence of the proof is that this sum:
    ##\sum |B a_j - C b_j|^2##
    is shown to be equal to the following expression:
    ##B(AB - |C|^2)##
    Now since each term of the first sum is nonnegative, the sum is greater than or equal to zero, so the expression ##B(AB - |C|^2)## is also greater than or equal to zero. If ##B = 0## the theorem is trivial, so assume ##B > 0##; then the inequality ##B(AB - |C|^2) \geq 0## implies ##AB - |C|^2 \geq 0##, which is the theorem.
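    For reference, here is the expansion behind that identity, written out with Rudin's definitions ##A = \sum_j |a_j|^2##, ##B = \sum_j |b_j|^2##, ##C = \sum_j a_j \overline{b_j}## (a sketch of the computation, not quoted from the book; note ##B## is real):
    $$\begin{aligned}
    \sum_j |B a_j - C b_j|^2 &= \sum_j \left(B a_j - C b_j\right)\left(B\,\overline{a_j} - \overline{C}\,\overline{b_j}\right) \\
    &= B^2 \sum_j |a_j|^2 - B\overline{C} \sum_j a_j \overline{b_j} - BC \sum_j \overline{a_j}\, b_j + |C|^2 \sum_j |b_j|^2 \\
    &= B^2 A - B|C|^2 - B|C|^2 + B|C|^2 \\
    &= B(AB - |C|^2).
    \end{aligned}$$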

    Now naturally what I want to understand is how one arrives at this proof in the first place. Some intuition to start with: if ##AB - |C|^2## could be written as a single sum, each term of which is nonnegative, that would give the desired result. But Rudin adds a step, showing instead that ##B(AB - |C|^2)## equals such a sum, after which the factor ##B## can be divided out. What train of thought would have led Rudin to this proof?
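    The closest I can get on my own is the familiar quadratic trick (a sketch, assuming ##B > 0## and Rudin's notation as above): for any complex scalar ##t##,
    $$0 \le \sum_j |a_j - t b_j|^2 = A - \overline{t}\,C - t\,\overline{C} + |t|^2 B,$$
    and choosing ##t = C/B## makes the right-hand side equal to ##A - |C|^2/B##, so ##AB - |C|^2 \geq 0##. Multiplying through by ##B## before substituting, i.e. looking at ##\sum_j |B a_j - C b_j|^2## instead, avoids the division and keeps everything a single sum. Maybe that is the idea, but I am not certain this is how Rudin saw it.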

    There is an explanation offered here:
    http://math.berkeley.edu/~gbergman/ug.hndts/06x2+03F_104_q+a.txt

    But I am still struggling to figure out that explanation too. Can anyone either help or direct me to a useful resource?

    Thanks!
     
  2. Mar 30, 2013 #2
    IMHO, Rudin may have proven it this way simply because he found it more elegant. When I took analysis, I approached it in a way that is more of a derivation than a proof, loosely speaking. The Cauchy-Schwarz inequality holds in any inner product space, so given two vectors a and b in ##\mathbb{R}^n##, the formula falls out of the dot product, which in turn can be seen as a consequence of the law of cosines. I too wanted a more solid analytical proof than that, but following the advice of the text and my professor, instead of letting the switch between seeing n-tuples as vectors and seeing them as points in space confuse me, I embraced going back and forth between the two views: to any point p in n-space we may assign an n-dimensional vector ##\vec{op}##, and conversely to any vector ##\vec{op}## we may assign a point p. Once you move freely between the two mindsets, a proof of the Cauchy-Schwarz inequality in the vector sense is sufficient; in fact, you probably won't take the time to prove it at all, since it looks obviously true.
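    To make the vector statement concrete (a sketch for the real case ##\mathbb{R}^n## only; Rudin's theorem is stated for complex numbers, where the cosine picture does not directly apply):
    $$\mathbf{a} \cdot \mathbf{b} = |\mathbf{a}|\,|\mathbf{b}|\cos\theta \quad\Longrightarrow\quad |\mathbf{a} \cdot \mathbf{b}| \le |\mathbf{a}|\,|\mathbf{b}|,$$
    which in coordinates is exactly
    $$\left(\sum_j a_j b_j\right)^2 \le \left(\sum_j a_j^2\right)\left(\sum_j b_j^2\right).$$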
     