Tensor Product, Basis Vectors and Tensor Components

In summary, the conversation discusses the tensor product, a mathematical operation that combines two vectors, matrices, or higher-dimensional arrays into a new tensor. Basis vectors are also mentioned: linearly independent vectors that can represent any vector in a given vector space and that form the building blocks for more complex tensors. These basis vectors define the coordinate system in which tensor components are measured. The difference between covariant and contravariant tensor components is also explained, with covariant components obtained by evaluating the tensor on the basis vectors and contravariant components obtained by evaluating it on the dual basis vectors. Finally, the process of calculating tensor components using basis vectors is discussed, which involves evaluating the tensor on the basis vectors and dual basis vectors of the chosen coordinate system.
  • #1
nigelscott
I am trying to figure out how to get 1. from 2. and vice versa, where the ##e##'s are a basis for the vector space and the ##\theta##'s are a basis for the dual vector space.

1. ##T = {T^{\mu\nu}}_{\sigma\rho}\,(e_\mu \otimes e_\nu \otimes \theta^\sigma \otimes \theta^\rho)##

2. ##{T^{\mu\nu}}_{\sigma\rho} = T(\theta^\mu, \theta^\nu, e_\sigma, e_\rho)##

My attempt is as follows:

Substituting 2. into 1. gives ##T = T(\theta^\mu, \theta^\nu, e_\sigma, e_\rho)\,(e_\mu \otimes e_\nu \otimes \theta^\sigma \otimes \theta^\rho)##

Now if I assume that ##T(\theta^\mu, \theta^\nu, e_\sigma, e_\rho) \equiv T(\theta^\mu \otimes \theta^\nu \otimes e_\sigma \otimes e_\rho)## this becomes:

##T = T(\theta^\mu \otimes \theta^\nu \otimes e_\sigma \otimes e_\rho)(e_\mu \otimes e_\nu \otimes \theta^\sigma \otimes \theta^\rho)##

##= T\big(\theta^\mu(e_\mu) \otimes \theta^\nu(e_\nu) \otimes e_\sigma(\theta^\sigma) \otimes e_\rho(\theta^\rho)\big)##

Now using ##\theta^\nu(e_\mu) = \delta^\nu_\mu## this becomes:

##T = T\,(I \otimes I \otimes I \otimes I)##

So ##T = T##

This seems to work but I'm not sure if this is the correct way to do it. I'm shaky on the tensor product stuff and my interpretation of T(_,_,_,_). Does this look right?
 
  • #2
nigelscott said:
This seems to work but I'm not sure if this is the correct way to do it. I'm shaky on the tensor product stuff and my interpretation of T(_,_,_,_). Does this look right?

It looks all right to me!

Let's try to intuitively understand what you did.

##T## is the generalization of a vector, in the sense that ##T## is simply the sum of a bunch of components ##{T^{\mu\nu}}_{\sigma\rho}## multiplied by basis vectors ##e_{\mu} \otimes e_{\nu} \otimes \theta^{\sigma}\otimes \theta^{\rho}##. This is the interpretation of equation ##1## in your post.

Therefore, in order to get the component ##{T^{\mu\nu}}_{\sigma\rho}##, you would naively want to multiply the basis vector ##e_{\mu} \otimes e_{\nu} \otimes \theta^{\sigma}\otimes \theta^{\rho}## with itself. But there is no natural way to multiply a basis vector with itself unless you bring in extra structure such as a metric, and on a curved manifold the basis need not be orthonormal anyway. What is always defined is the pairing of a vector with a dual vector, ##\theta^{\mu}(e_{\nu}) = \delta^{\mu}_{\nu}##. So you pair the basis vector ##e_{\mu} \otimes e_{\nu} \otimes \theta^{\sigma}\otimes \theta^{\rho}## not with itself, but with its dual basis vector ##\theta^{\mu} \otimes \theta^{\nu} \otimes e_{\sigma} \otimes e_{\rho}##. That's exactly the interpretation of equation ##2## in your post.

Your check of the consistency of equations ##1## and ##2## is simply a mathematical way of rewriting my above two paragraphs.
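Just to spell out the "vice versa" direction as well (a sketch using only multilinearity and the duality relation; the dummy indices are renamed to ##\alpha\beta\gamma\delta## so they don't clash with the free indices): plug equation ##1## into the slots of equation ##2##,

##T(\theta^\mu, \theta^\nu, e_\sigma, e_\rho) = {T^{\alpha\beta}}_{\gamma\delta}\,(e_\alpha \otimes e_\beta \otimes \theta^\gamma \otimes \theta^\delta)(\theta^\mu, \theta^\nu, e_\sigma, e_\rho)##

##= {T^{\alpha\beta}}_{\gamma\delta}\; e_\alpha(\theta^\mu)\, e_\beta(\theta^\nu)\, \theta^\gamma(e_\sigma)\, \theta^\delta(e_\rho) = {T^{\alpha\beta}}_{\gamma\delta}\; \delta^\mu_\alpha\, \delta^\nu_\beta\, \delta^\gamma_\sigma\, \delta^\delta_\rho = {T^{\mu\nu}}_{\sigma\rho}##

which is equation ##2##.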
 
  • #3
Let me know if my answer is clear, or if there's anything that you would need clarification with.
 

1. What is a tensor product?

The tensor product is a mathematical operation that combines two vectors, matrices, or higher-dimensional arrays to create a new tensor. It is often used in physics and engineering to represent physical quantities, such as forces or moments, that have both magnitude and direction.
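As a concrete illustration (a minimal numpy sketch, not from the discussion above), the tensor product of two ordinary vectors is just their outer product:

Code:
import numpy as np

# Two vectors in R^3
u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])

# Tensor (outer) product: a rank-2 tensor with components (u ⊗ v)_ij = u_i * v_j
T = np.tensordot(u, v, axes=0)   # equivalent to np.outer(u, v) for vectors
print(T.shape)                   # (3, 3)
print(T[1, 2])                   # u[1] * v[2] = 2.0 * 6.0 = 12.0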

2. What are basis vectors?

Basis vectors are a set of linearly independent vectors that can be used to represent any vector in a given vector space. They form the building blocks for more complex vectors and tensors, and are often chosen to simplify calculations and analyses.
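For example (a small numpy sketch with made-up numbers), expanding a vector in a chosen, possibly non-orthonormal, basis amounts to solving a linear system for its coefficients:

Code:
import numpy as np

# A non-orthonormal basis for R^2, stored as the columns of E
e1 = np.array([1.0, 0.0])
e2 = np.array([1.0, 1.0])
E = np.column_stack([e1, e2])

# Any vector v can be written as v = c1*e1 + c2*e2; solve for the coefficients
v = np.array([3.0, 2.0])
c = np.linalg.solve(E, v)        # components of v in the basis {e1, e2}
print(c)                         # [1. 2.]  ->  v = 1*e1 + 2*e2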

3. How do basis vectors relate to tensor components?

In a tensor, the basis vectors are used to define the coordinate system in which the tensor components are measured. This means that the components of a tensor can vary depending on the choice of basis vectors used, but the underlying physical quantity represented by the tensor remains the same.
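To see this concretely (another numpy sketch, with illustrative numbers): the component matrix of a rank-2 covariant tensor changes under a change of basis, but the number it assigns to a fixed pair of vectors does not:

Code:
import numpy as np

# A rank-2 covariant tensor (bilinear form) given by its components in the standard basis
G = np.array([[2.0, 1.0],
              [1.0, 3.0]])

# A new basis, stored as the columns of E
E = np.array([[1.0, 1.0],
              [0.0, 1.0]])

# Components of the same tensor in the new basis: G'_{ab} = G(e'_a, e'_b) = (E^T G E)_{ab}
G_new = E.T @ G @ E

# The tensor itself is unchanged: it assigns the same number to fixed vectors
u = np.array([1.0, 2.0])
w = np.array([3.0, -1.0])
u_new = np.linalg.solve(E, u)    # components of u in the new basis
w_new = np.linalg.solve(E, w)
print(u @ G @ w, u_new @ G_new @ w_new)   # both give 5.0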

4. What is the difference between covariant and contravariant tensor components?

Covariant components (lower indices) are obtained by evaluating the tensor on the basis vectors, while contravariant components (upper indices) are obtained by evaluating it on the dual basis vectors. This distinction matters when transforming tensors between coordinate systems, because the two kinds of components transform with mutually inverse matrices, even though they describe the same underlying object.
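A small numpy sketch of the two kinds of components for a single vector (the Euclidean dot product stands in for the metric here, purely for illustration):

Code:
import numpy as np

# A non-orthonormal basis for R^2 (columns of E) and its dual basis (rows of inv(E))
E = np.array([[1.0, 1.0],
              [0.0, 1.0]])
E_dual = np.linalg.inv(E)        # row a is the dual basis covector theta^a

v = np.array([3.0, 2.0])

# Contravariant components: pair v with the dual basis, v^a = theta^a(v)
v_contra = E_dual @ v            # [1. 2.]

# Covariant components: pair v with the basis vectors via the dot product, v_a = v . e_a
v_cov = E.T @ v                  # [3. 5.]

print(v_contra, v_cov)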

5. How are tensor components calculated using basis vectors?

The components of a tensor are calculated by evaluating the tensor on basis vectors and dual basis vectors, one for each of its slots: dual basis vectors go into the slots that produce upper (contravariant) indices, and basis vectors go into the slots that produce lower (covariant) indices, exactly as in equation 2 of the thread above. The resulting numbers are the tensor components in that coordinate system.
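A sketch of the same recipe in numpy for a mixed (1,1)-tensor (illustrative numbers; each upper index gets a dual basis covector, each lower index gets a basis vector):

Code:
import numpy as np

# A (1,1)-tensor (a linear map) given by its matrix A in the standard basis
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# A new basis (columns of E) and its dual basis (rows of inv(E))
E = np.array([[1.0, 1.0],
              [0.0, 1.0]])
E_dual = np.linalg.inv(E)

# Components in the new basis: T^mu_nu = T(theta^mu, e_nu) = theta^mu(A e_nu)
T = np.einsum('mi,ij,jn->mn', E_dual, A, E)

print(T)                          # same as E_dual @ A @ E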
