Understanding Tensors: Exploring Their Operations and Applications

  • Thread starter GarageDweller
  • Start date
  • Tags
    Tensors
In summary, when a tensor is supplied with only some of its arguments, it produces a tensor of lower order than the original. This is known as contraction: indices of the tensor are removed through summation. It is a general property of tensors, regardless of rank, because it is simply the inner product being applied.
  • #1
GarageDweller
Hi, I've seen in some texts that a tensor is supplied with only one (or two) of its arguments when it takes more than that, producing a tensor of lower order than the original.
Is this a formal operation?
For example, the moment of inertia tensor takes two arguments; supplying it with an angular velocity vector gives an angular momentum vector.
 
  • #2
Hey GarageDweller.

The term for reducing the number of indices (upper and lower) of a tensor is contraction. The idea is that a repeated index is 'summed out': it no longer appears as a free index of the result, because it has been absorbed into a summation rather than being left to vary.
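For example, contracting an arbitrary mixed rank-3 tensor (the symbol [itex]T^{ij}{}_{k}[/itex] here is just an illustration) over its second upper index and its lower index sums that repeated index out and leaves a vector:

[tex]v^{i} = T^{ij}{}_{j} = \sum_{j} T^{ij}{}_{j}[/tex]

The index j has been summed over, so it is no longer free, and the order of the tensor drops by two (one upper and one lower index).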

You'd probably be better off reading about contraction in tensor algebra than having me explain it, because I don't think I'd do it as much justice.
 
  • #3
Yes, it's a general property of [2nd order] tensors that, when summed over one index with a vector, the operation produces a new vector that in general is not in the same direction as the original vector. So with the moment of inertia tensor you would have [itex]L_{i} = I_{ij} \omega_{j}[/itex]. Similarly, if you sum a 2nd order tensor over both indices with two vectors, you get a scalar invariant. Again, in the case of the moment of inertia tensor this would be twice the rotational kinetic energy, [itex]2T = I_{ij} \omega_{i} \omega_{j}[/itex].

And yes, it is a general property of tensors of any rank, because it's simply the inner product that you're seeing. Each inner product with a vector reduces the rank of the tensor by 1.
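As a quick numerical sketch of those two contractions (using NumPy's einsum; the inertia tensor and angular velocity values below are arbitrary, purely for illustration):

[code]
import numpy as np

# Arbitrary symmetric moment of inertia tensor (kg m^2) and angular velocity (rad/s)
I = np.array([[2.0, 0.1, 0.0],
              [0.1, 3.0, 0.2],
              [0.0, 0.2, 1.5]])
omega = np.array([1.0, 0.5, -0.3])

# Contract over one index: L_i = I_ij w_j  -> a vector (rank reduced by 1)
L = np.einsum('ij,j->i', I, omega)

# Contract over both indices: 2T = I_ij w_i w_j  -> a scalar (rank reduced by 2)
twice_T = np.einsum('ij,i,j->', I, omega, omega)

print(L)        # angular momentum vector, generally not parallel to omega
print(twice_T)  # twice the rotational kinetic energy
[/code]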
 

Related to Understanding Tensors: Exploring Their Operations and Applications

1. What are tensors and why are they important?

Tensors are mathematical objects that describe multilinear relationships between vectors, scalars, and other tensors in a multi-dimensional space. They are important in many areas of science, including physics, engineering, and computer science, because they allow us to model and understand complex systems and interactions between different variables.

2. How are tensors different from matrices?

Matrices are two-dimensional arrays of numbers, while tensors can have any number of dimensions (indices). This makes tensors more versatile, able to represent more complex relationships between variables. Additionally, a tensor in the physical or geometric sense obeys specific transformation rules under a change of coordinates, a property that a bare array of numbers does not have.
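As a simple sketch of the array point of view (using NumPy; the shapes below are arbitrary):

[code]
import numpy as np

scalar = np.array(3.0)          # 0 dimensions: a single number
vector = np.zeros(3)            # 1 dimension: shape (3,)
matrix = np.zeros((3, 3))       # 2 dimensions: an ordinary matrix
rank3 = np.zeros((3, 3, 3))     # 3 dimensions: no single-matrix equivalent

print(matrix.ndim, rank3.ndim)  # 2 3
print(rank3.shape)              # (3, 3, 3)
[/code]

This captures only the multi-dimensional-array view; the transformation behaviour described above is what distinguishes a true tensor from an arbitrary array.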

3. What are some real-world applications of tensors?

Tensors have a wide range of applications in various fields. For example, in physics, tensors are used to describe the behavior of physical systems, such as in Einstein's theory of general relativity. In engineering, they are used in structural analysis and optimization. In machine learning, tensors are used to represent and manipulate data in neural networks.

4. How are tensors used in deep learning?

Tensors are an essential component of deep learning algorithms, as they are used to store and manipulate large amounts of data, such as images, text, and audio. In deep learning, tensors are used to represent the weights and biases of a neural network, as well as the input and output data of each layer. This allows the network to learn and make predictions based on the relationships between the different variables.
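For example, a single fully connected layer can be written as one contraction between a batch of input vectors and a weight matrix (a minimal NumPy sketch; the layer sizes and names here are purely illustrative):

[code]
import numpy as np

batch, in_dim, out_dim = 4, 8, 2        # illustrative sizes
x = np.random.randn(batch, in_dim)      # input tensor: one row per sample
W = np.random.randn(in_dim, out_dim)    # weight tensor learned during training
b = np.zeros(out_dim)                   # bias vector

# Forward pass: contract the feature index of x with the first index of W,
# then broadcast-add the bias.
y = np.einsum('bi,io->bo', x, W) + b
print(y.shape)                          # (4, 2)
[/code]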

5. Are there any resources for learning more about tensors?

Yes, there are many online resources available for learning more about tensors, including tutorials, courses, and textbooks. Some recommended resources include "Introduction to Tensors for Students of Physics and Engineering" by Joseph C. Kolecki and "Tensor Analysis and Elementary Differential Geometry for Physicists and Engineers" by Hung Nguyen-Schäfer.
