Originally posted by saski
But I can quote MTW: "Contraction seals off two of the tensor's slots, reducing the rank by two." That has to include the contraction of the tensor product of a contravariant and a covariant vector. If we're not to call that a contraction, what should we call it?
contraction refers to what you do when you get rid of a pair of indices in tensor index notation by summing over them.
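a concrete numpy sketch of that (my own illustration, not from MTW): contracting a rank-3 tensor over two of its slots "seals them off" and leaves a rank-1 tensor.

```python
import numpy as np

# a rank-3 tensor T^i_jk with shape (3, 3, 3)
T = np.arange(27, dtype=float).reshape(3, 3, 3)

# contract the first and second slots: v_k = T^i_ik
v = np.einsum('iik->k', T)

# the contraction sealed off two slots: rank 3 -> rank 1
print(T.ndim, v.ndim)  # 3 1
```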
i have never ever heard this term applied to Hilbert spaces. i believe the reason is that no one uses indices to label the states of their Hilbert space (which would be a very awkward notation indeed if it is not finite)
"Mathematicians term <B|A> the inner product of a bra and a ket." Bras are defined as linear functionals operating on kets, i.e. dual vectors. So <B|A> is a contraction by MTW's definition. Howvever, t there's a norm on the Hilbert space allowing one to convert between bras and kets, so <B|A> is also the inner product of |A> and |B>.
well, i am not a mathematician, so perhaps i shouldn't speak for them, but as far as i can tell, mathematicians do not use bra ket notation at all, because it is extremely sloppy.
and every math book i know defines an inner product as a positive definite bilinear (or perhaps antilinear in one argument) form. some more physically minded texts allow for more general nondegenerate forms (instead of positive definite)
but the point is, it is an operation on two
VECTORS
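a quick numpy sketch of exactly that definition (my example, using the standard inner product on C^n): it is antilinear in the first argument, positive definite, and it eats two vectors.

```python
import numpy as np

u = np.array([1.0 + 2.0j, 0.0 - 1.0j])
v = np.array([3.0 - 1.0j, 2.0 + 0.0j])

def inner(a, b):
    """<a, b> = sum_i conj(a_i) b_i -- antilinear in the first slot."""
    return np.vdot(a, b)  # np.vdot conjugates its first argument

# antilinear in the first argument: <c*u, v> = conj(c) <u, v>
c = 2.0 - 3.0j
assert np.isclose(inner(c * u, v), np.conj(c) * inner(u, v))

# positive definite: <u, u> is real and > 0 for u != 0
assert inner(u, u).real > 0 and np.isclose(inner(u, u).imag, 0.0)
```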
Consider:
w_i = \epsilon_{ijk} u^j v^k
It becomes a vector product only by raising the index on w, which requires a metric, i.e. definition of orthogonality. Or you can write the exterior product:
u^j v^k - v^j u^k
but you need a Hodge star to make a vector out of it, and you need the metric for the Hodge star.
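to make the point above concrete, here is a small numpy sketch (my own, with a hypothetical hand-built Levi-Civita array): the epsilon contraction gives a covector w_i, and only after raising the index with a metric (trivially the identity in Euclidean R^3) do you recover the familiar cross product.

```python
import numpy as np

# Levi-Civita symbol eps_{ijk} in three dimensions
eps = np.zeros((3, 3, 3))
for i, j, k in [(0, 1, 2), (1, 2, 0), (2, 0, 1)]:
    eps[i, j, k] = 1.0
    eps[i, k, j] = -1.0

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])

# w_i = eps_{ijk} u^j v^k  -- a covector until an index is raised
w_lower = np.einsum('ijk,j,k->i', eps, u, v)

# with the Euclidean metric g = I, raising the index is trivial,
# and we recover the usual cross product
g_inv = np.eye(3)
w_upper = g_inv @ w_lower
assert np.allclose(w_upper, np.cross(u, v))
```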
when you say "metric", do you mean Riemannian metric?
anyway, which point are you trying to prove here? that you need orthogonality to define the cross product? i am a little lost with what you are trying to show here, but i have a strong suspicion that it is wrong.
The Lie bracket expresses non-commutation of Lie derivative operators; it's not a simple matter of alternating tensor products, and it's certainly not the same thing as a vector product.
perhaps you should review the definition of Lie algebra.
want me to tell you? ok:
firstly, a Lie algebra is an algebra, which means it is a vector space with a vector product.
this vector product is bilinear (as all products must be), but neither commutative nor associative.
perhaps you can tell me your definition of vector product, so we can make sure we both know what the other is talking about. i told you mine: a product which is vector valued. the Lie bracket
certainly satisfies this requirement. if you think otherwise, you are just wrong.
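a numpy sketch of the claim (my example, using the matrix commutator on gl(2, R)): the bracket is vector valued, anticommutative, non-associative, and satisfies the Jacobi identity.

```python
import numpy as np

def bracket(A, B):
    """Lie bracket on gl(2, R): [A, B] = AB - BA."""
    return A @ B - B @ A

A = np.array([[0.0, 1.0], [0.0, 0.0]])
B = np.array([[0.0, 0.0], [1.0, 0.0]])
C = np.array([[1.0, 0.0], [0.0, -1.0]])

# vector valued: the bracket of two matrices is again a matrix
# anticommutative: [A, B] = -[B, A]
assert np.allclose(bracket(A, B), -bracket(B, A))

# not associative: [[A, B], C] != [A, [B, C]] in general
assert not np.allclose(bracket(bracket(A, B), C), bracket(A, bracket(B, C)))

# but it does satisfy the Jacobi identity
jacobi = (bracket(A, bracket(B, C)) + bracket(B, bracket(C, A))
          + bracket(C, bracket(A, B)))
assert np.allclose(jacobi, 0.0)
```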
And matrix multiplication is entirely the multiplication of row with column vectors. Again, not a vector product.
I stand by what I said.
see above. please tell me your definition of vector product. if you define the vector product to be the cross product in R^3, then of course anything else will not be.
However, I've permitted confusion between that exterior product and the Clifford product.
What the Clifford people do is make a grand basis containing scalar unity, the unit vectors, unit bivectors, unit trivectors, etc; the whole graded sequence of exterior products. Then they define a super-product on the span of that, which they call the "associative" OR "geometric" OR "Clifford" product.
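here is a toy sketch of that super-product (entirely my own construction, not from any GA library): multivectors of Cl(2, 0) as coefficient vectors over the graded basis (1, e1, e2, e12), assuming the Euclidean signature e1^2 = e2^2 = 1.

```python
import numpy as np

def geometric_product(a, b):
    """Clifford product on Cl(2, 0) over the basis (1, e1, e2, e12)."""
    s1, x1, y1, b1 = a
    s2, x2, y2, b2 = b
    return np.array([
        s1*s2 + x1*x2 + y1*y2 - b1*b2,   # scalar part (e12^2 = -1)
        s1*x2 + x1*s2 - y1*b2 + b1*y2,   # e1 part
        s1*y2 + y1*s2 + x1*b2 - b1*x2,   # e2 part
        s1*b2 + b1*s2 + x1*y2 - y1*x2,   # e12 part
    ])

e1 = np.array([0.0, 1.0, 0.0, 0.0])
e2 = np.array([0.0, 0.0, 1.0, 0.0])

# e1 e2 = e12 (the unit bivector), while e1 e1 = 1 (a scalar):
# the product mixes grades across the whole graded basis
assert np.allclose(geometric_product(e1, e2), [0, 0, 0, 1])
assert np.allclose(geometric_product(e1, e1), [1, 0, 0, 0])
```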
yeah, i know what a Clifford algebra is, but thanks.
according to that link, geometric algebra people call the exterior product the outer product. OK, although this is different from your previous answer, i find it more plausible, so i will accept this answer.
and so according to your quote above, Grassmann invented the Grassmann algebra, a.k.a. the exterior algebra. this is exactly what i claimed.