farleyknight
This isn't homework, I just want to get some intuition about something.
So I'm looking at the definition, along with some examples, of http://en.wikipedia.org/wiki/Inner_product_space . It seems to me that the five examples they list are just special cases of summations. The dot product is obviously a summation. The complex vector version of the inner product becomes a summation via matrix multiplication. The integral used on Hilbert spaces is another kind of (continuous) summation. The expectation of random variables and the trace of the product of two matrices are both just summations as well.
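To make that more concrete, here is roughly how I'm reading each of those examples (my own paraphrase, not the article's exact notation):

<x, y> = sum_i x_i y_i (dot product on R^n)
<x, y> = sum_i x_i conj(y_i) = y* x (complex vectors, via matrix multiplication)
<f, g> = integral of f(t) conj(g(t)) dt (square-integrable functions)
<X, Y> = E[X Y] (random variables, which is a sum or integral over outcomes)
<A, B> = tr(A B*) = sum_{i,j} a_{ij} conj(b_{ij}) (matrices)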
So are all inner products just a "summation", in a sense?
My main curiosity is actually how one would prove the Cauchy-Schwarz inequality in an arbitrary inner product space. Does the proof usually involve boiling everything down to summations, or are there other standard tricks involved?
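For reference, the general statement I have in mind is |<x, y>|^2 <= <x, x> <y, y> for all x, y in the space, where no coordinates or sums appear anywhere, which is what makes me wonder what such a proof actually has to work with.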