
Homework Help: Are all inner products just summations?

  1. Sep 3, 2010 #1
    This isn't homework, I just want to get some intuition about something.

    So I'm looking at the definition, along with some examples, of an inner product space (http://en.wikipedia.org/wiki/Inner_product_space). It seems to me that the five examples they list are all special cases of summation. The dot product is obviously a summation. The complex-vector version of the inner product becomes a summation via matrix multiplication. The integral used on Hilbert spaces is another version of (continuous) summation. The expectation of random variables and the trace of the product of two matrices are both just summations.

    So are all inner products just a "summation", in a sense?

    My main curiosity is actually how one would prove the Cauchy-Schwarz inequality in an arbitrary inner product space. Does it usually involve boiling everything down to summations, or are there other standard tricks involved?
  3. Sep 3, 2010 #2


    Science Advisor
    Homework Helper

    The proof of Cauchy-Schwarz does involve a trick, but it doesn't use the fact that the inner product is defined as a form of summation. It looks the same in any inner product space.
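    For reference, here is a sketch of the standard trick in a real inner product space (the complex case needs a small phase adjustment). For any vectors u, v and any real t,

    [tex]0 \le \langle u + t v, u + t v\rangle = \langle u,u\rangle + 2t\langle u,v\rangle + t^2\langle v,v\rangle.[/tex]

    The right-hand side is a quadratic in t that is never negative, so its discriminant must satisfy

    [tex]4\langle u,v\rangle^2 - 4\langle u,u\rangle\langle v,v\rangle \le 0,[/tex]

    which is exactly Cauchy-Schwarz. Note that the argument uses only the inner product axioms, never any summation formula.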
  4. Sep 3, 2010 #3
    Unfortunately, immediately after I posted, I scrolled further down the Wikipedia page and I think I saw the trick you are talking about. :(

    Are there other inequalities (I know the triangle inequality offhand) that are useful in continuity proofs? I'd like to get more practice in this area. Most of my daydream math is abstract algebra, so I'm not as familiar with this topic as I ought to be.
  5. Sep 3, 2010 #4
    Personally, I do not know of an inner product that couldn't be "abstractly" characterized as a sum. For example, inner products on function spaces are often integrals, which some might argue are infinite sums. A common inner product on matrix spaces is the trace-inner product which again can be written in terms of a sum.

    Probably the reason sums are so common when dealing with inner products is that one inherently has "addition" on the vector space. Since the inner product must be linear in one argument and conjugate-linear in the other, that linearity often manifests itself as a summation over the underlying field.

    So if you don't count integrals as sums, there's an example for you. If your question is whether there is an inner product on a vector space that does not make use of the underlying field's addition, though, that's too advanced a question for me to answer.
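    The trace inner product mentioned above can be checked concretely. Here is a minimal pure-Python sketch (the matrices are arbitrary examples) showing that <A, B> = tr(A^T B) on real matrices reduces to an entrywise sum of products:

    ```python
    def trace_inner(A, B):
        """<A, B> = tr(A^T B) for real matrices given as lists of rows."""
        n, m = len(A), len(A[0])
        # (A^T B)[j][j] = sum_i A[i][j] * B[i][j], so the trace is
        # simply the sum of all entrywise products.
        return sum(A[i][j] * B[i][j] for i in range(n) for j in range(m))

    A = [[1.0, 2.0], [3.0, 4.0]]
    B = [[5.0, 6.0], [7.0, 8.0]]

    # Compare against an explicit tr(A^T B) computation.
    AtB = [[sum(A[k][i] * B[k][j] for k in range(2)) for j in range(2)]
           for i in range(2)]
    trace = AtB[0][0] + AtB[1][1]
    assert abs(trace_inner(A, B) - trace) < 1e-12
    ```

    Both routes give 1*5 + 2*6 + 3*7 + 4*8 = 70, which is the "summation" form of this inner product.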
  6. Sep 3, 2010 #5


    Staff Emeritus
    Science Advisor
    Gold Member

    If you write things in the correct basis then yes, but not really in the way that you are describing it.

    Let's look at just the finite-dimensional case. You have some inner product <,> and a vector space V of dimension n. Start by picking any nonzero vector, which we will call e1, such that <e1,e1>=1 (we know we can do this because if <v,v>=x, then <v/sqrt(x), v/sqrt(x)> = x/x = 1). If U is the set of vectors orthogonal to e1, we can pick a vector e2 from U such that <e2,e2>=1. Then if W is the set of vectors orthogonal to both e1 and e2, we can pick a vector e3 from W whose inner product with itself is 1 as well.

    Eventually we've picked vectors e1,...,en which are mutually orthogonal and each have <ei,ei>=1. If you write a vector v as [tex]v=\sum_i \langle v, e_i\rangle e_i[/tex] in the coordinates of this basis, then one can see that [tex]\langle v,u\rangle=\sum_i v_i u_i[/tex] where [tex]v_i=\langle v,e_i\rangle[/tex].

    So basically every inner product is just the dot product with respect to some basis.

    This isn't really the right way to approach the Cauchy-Schwarz inequality, though.
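    The change-of-basis construction above can be sketched numerically in pure Python. The matrix M below is a hypothetical positive-definite choice defining an inner product <x, y> = x^T M y on R^2; running Gram-Schmidt with <,> in place of the dot product produces a basis in whose coordinates <,> becomes the plain dot product:

    ```python
    import math

    M = [[2.0, 1.0], [1.0, 3.0]]  # symmetric positive definite (example)

    def ip(x, y):
        """The inner product <x, y> = x^T M y."""
        return sum(x[i] * M[i][j] * y[j] for i in range(2) for j in range(2))

    # Gram-Schmidt on the standard basis, using <,> instead of the dot product.
    basis = []
    for v in ([1.0, 0.0], [0.0, 1.0]):
        w = list(v)
        for e in basis:
            c = ip(w, e)
            w = [w[i] - c * e[i] for i in range(2)]
        norm = math.sqrt(ip(w, w))
        basis.append([w[i] / norm for i in range(2)])

    def coords(v):
        # v_i = <v, e_i>, exactly as in the post.
        return [ip(v, e) for e in basis]

    # In these coordinates, <u, v> is the plain dot product.
    u, v = [1.0, 2.0], [3.0, -1.0]
    cu, cv = coords(u), coords(v)
    dot = sum(a * b for a, b in zip(cu, cv))
    assert abs(dot - ip(u, v)) < 1e-9
    ```

    The final assertion is the point of the post: once the basis is orthonormal with respect to <,>, the abstract inner product and the coordinate dot product agree.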
  7. Sep 3, 2010 #6
    Office Shredder:

    Have you not implicitly assumed that your vector space has countable dimension? An inner product space need not be separable, let alone of countable dimension.
  8. Sep 3, 2010 #7


    Staff Emeritus
    Science Advisor
    Gold Member

    I did restrict to the finite-dimensional case, so there the dimension is certainly countable.

    However, Zorn's lemma guarantees you can find some (possibly uncountable) orthonormal basis, in which you allow infinite series as well as finite sums of basis vectors.
  9. Sep 3, 2010 #8
    Ah yes, I must have missed that. Mea Culpa.
  10. Sep 3, 2010 #9


    Science Advisor

    For example, the set of all functions defined on an interval [a, b] such that [itex]\int_a^b f^2(x)\,dx[/itex] exists (is finite) is a vector space on which an inner product (dot product) can be defined as [itex]\int_a^b f(x)g(x)\,dx[/itex].
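    As a rough numerical illustration of this "continuous summation" view, here is a pure-Python sketch approximating the L^2 inner product by a midpoint Riemann sum and checking Cauchy-Schwarz for it (the functions sin and exp and the interval [0, 1] are arbitrary choices):

    ```python
    import math

    def l2_inner(f, g, a, b, n=100000):
        """Approximate <f, g> = integral of f(x) g(x) dx over [a, b]."""
        h = (b - a) / n
        # Midpoint Riemann sum: the inner product as a literal summation.
        return sum(f(a + (k + 0.5) * h) * g(a + (k + 0.5) * h)
                   for k in range(n)) * h

    a, b = 0.0, 1.0
    f, g = math.sin, math.exp

    fg = l2_inner(f, g, a, b)
    ff = l2_inner(f, f, a, b)
    gg = l2_inner(g, g, a, b)

    # Cauchy-Schwarz: <f, g>^2 <= <f, f> <g, g>.
    assert fg * fg <= ff * gg
    ```

    The inequality holds here just as in the finite-dimensional case, which matches the earlier point that the proof only uses the inner product axioms.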