Why Does the Second Equality Hold in Multi-Linear Algebra?

brydustin
\{a_{i_j}\} is the dual basis to the basis \{e_{i_j}\}.
I want to show that

a_{i_1} \wedge a_{i_2} \wedge \cdots \wedge a_{i_k}\,(e_{i_1}, e_{i_2}, \ldots, e_{i_k}) = 1.

This is exercise 4.1(a) from Spivak. My approach was:

\bigwedge_{L=1}^{k} a_{i_L}\,(e_{i_1},\ldots,e_{i_k}) = k!\,\mathrm{Alt}\Big(\bigotimes_{L=1}^{k} a_{i_L}\Big)(e_{i_1},\ldots,e_{i_k}) = k!\,\mathrm{Alt}(T)(e_{i_1},\ldots,e_{i_k}) = k! \cdot \frac{1}{k!}\sum_{\sigma \in S_k} \mathrm{sgn}\,\sigma\; T(e_{i_{\sigma(1)}},\ldots,e_{i_{\sigma(k)}})

where T = \bigotimes_{L=1}^{k} a_{i_L}.

So there is already a result for what T(e_{i_1},\ldots,e_{i_k}) is: 1 if all the sub-indices agree, and 0 otherwise. My question is: is T(e_{i_{\sigma(1)}},\ldots,e_{i_{\sigma(k)}}) any different?

I'm assuming that in the one-dimensional case we would say that T acts on a single argument linearly, but I'm kinda confused by the idea of having several arguments.

Otherwise, is there an easier approach to the solution?
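
For concreteness, a minimal sketch of the permutation question, assuming only the dual-basis pairing a_j(e_m) = \delta_{jm} and the rule (a \otimes b)(v, w) = a(v)\,b(w) for 1-forms: take T = a_1 \otimes a_2. Then

T(e_1, e_2) = a_1(e_1)\,a_2(e_2) = 1, \qquad T(e_2, e_1) = a_1(e_2)\,a_2(e_1) = 0,

so permuting the arguments of a tensor product does in general change the value; in the alternating sum above, only the identity permutation contributes when the indices i_1, \ldots, i_k are distinct.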
 
Okay... I have since figured out the solution. The real question then becomes: why is the second equals sign true (below)?

\bigotimes_{L=1}^{k} a_{i_L}\,(e_{i_{\sigma(1)}},\ldots,e_{i_{\sigma(k)}}) = \prod_{L=1}^{k} a_{i_L}(e_{i_{\sigma(L)}}), which is 1 if \sigma is the identity and 0 otherwise.

where \bigotimes denotes the iterated (indexed) tensor product. And if this second equals sign is true, can I take this view for all tensor products? Namely, is a tensor product of k parts operating on k arguments equal to the product of each "part" acting on its corresponding argument (with the same index)?
Can I always hold that view of a tensor? Are there tensors where this is more obvious and others ... not so much?
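
For the second equals sign, a short sketch using the definition of the tensor product of multilinear functions that Spivak works with; S, T, and v_1, \ldots, v_{k+l} below are generic placeholders, not objects from the exercise. If S is a k-tensor and T is an l-tensor, then

(S \otimes T)(v_1,\ldots,v_k, v_{k+1},\ldots,v_{k+l}) = S(v_1,\ldots,v_k) \cdot T(v_{k+1},\ldots,v_{k+l}).

Applying this repeatedly to 1-forms gives

\Big(\bigotimes_{L=1}^{k} a_{i_L}\Big)(v_1,\ldots,v_k) = \prod_{L=1}^{k} a_{i_L}(v_L),

so for an elementary (decomposable) tensor built from 1-forms, the "each part acts on its own argument" view holds by definition. A general k-tensor is only a linear combination of such products, so it need not split into a single product this way.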
 