Tensor: Definition, Examples & n,m Meaning

In summary, the conversation discusses the definition of a tensor and its properties. It is clarified that a tensor is not just a relation between two vectors, but a multilinear map that takes n one-forms and m vectors to a real number and is defined in a coordinate-independent manner. The conversation also touches on the (n,m) notation used in the examples section and explains that n and m represent the number of copies of V* and V, respectively, in the domain of the map.
  • #1
subsonicman
I was reading this page: http://en.wikipedia.org/wiki/Tensor
which said the definition of a tensor was a relation between two vectors. I then went down to the examples section and it had some sort of (n,m) notation. I had some theories on what they meant but none of them made sense. What do n and m represent?
 
  • #2
##n## is just the number of products of ##V^{*}## and ##m## is the number of products of ##V## which comprise the domain of the map; the codomain is just the reals.
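In components, a type-(1,1) tensor (one copy of ##V^{*}## and one copy of ##V## in the domain) is just a matrix that contracts one one-form and one vector into a single real number. A minimal numpy sketch of this idea, with illustrative names `T`, `w`, `v`:

```python
import numpy as np

# A type-(1,1) tensor T: V* x V -> R has components T^i_j in a chosen basis.
# Feeding it one one-form w (components w_i) and one vector v (components v^j)
# gives the real number T(w, v) = w_i T^i_j v^j.

T = np.array([[1.0, 2.0],
              [3.0, 4.0]])   # components T^i_j

w = np.array([1.0, 0.0])     # a one-form (covector)
v = np.array([0.0, 1.0])     # a vector

value = w @ T @ v            # contract both slots: w_i T^i_j v^j
print(value)                 # a single real number: 2.0
```

The codomain being the reals is visible here: however many slots the tensor has, evaluating it on a full set of arguments always yields one number.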
 
  • #3
subsonicman said:
which said the definition of a tensor was a relation between two vectors.

That is hardly true: a tensor is a multilinear object that maps n one-forms and m vectors into the real numbers, and whose components transform between coordinate systems so that the object itself is coordinate-invariant.
That is like saying multiplication is defined as a relation between two numbers.
 

What is a Tensor?

A tensor is a mathematical object that takes some number of vectors and one-forms as inputs and returns a real number, in a way that is linear in each input. It can be thought of as a higher-dimensional generalization of a vector or matrix, and is used in many fields such as physics, engineering, and machine learning.

What are some examples of Tensors?

Tensors can be found in many real-world applications. Some examples include stress and strain tensors in solid mechanics, the energy-momentum tensor in general relativity, and the weight matrix in neural networks.

What do the n and m values in a Tensor represent?

In a tensor of type (n,m), the numbers n and m count the tensor's arguments, not the dimensions of any space: n is the number of one-forms (elements of the dual space V*) and m is the number of vectors (elements of V) that the tensor takes as input, with the output always a single real number. Equivalently, n counts the contravariant (upper) indices and m the covariant (lower) indices. For example, a (1,1) tensor takes one one-form and one vector and returns a real number.
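As a concrete instance, the ordinary Euclidean dot product is a type-(0,2) tensor: it takes two vectors (and no one-forms) and returns a real number. A minimal numpy sketch, with illustrative names `g`, `u`, `v`:

```python
import numpy as np

# A type-(0,2) tensor g: V x V -> R takes two vectors and returns a real.
# The Euclidean dot product is the simplest example: its components g_ij
# form the identity matrix in an orthonormal basis.
g = np.eye(3)

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])

value = u @ g @ v  # g(u, v) = g_ij u^i v^j
print(value)       # 32.0, the ordinary dot product u . v
```

Swapping `g` for a different symmetric matrix gives a different inner product; the (0,2) type just says "two vector slots in, one real number out".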

How are Tensors different from Vectors and Matrices?

Tensors differ from vectors and matrices in that they can carry an arbitrary number of indices: vectors are rank-1 tensors and matrices represent rank-2 tensors. Tensors are also distinguished by their transformation law: their components change in a prescribed way under a change of basis so that the underlying object is basis-independent.
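The "number of indices" picture can be made concrete with numpy arrays, keeping in mind that an array only stores a tensor's components in one basis; the transformation law is what makes it a tensor. A small sketch:

```python
import numpy as np

# Component arrays of increasing rank (rank = number of indices,
# not to be confused with the dimension of the underlying vector space):
v = np.zeros(3)          # rank 1: a vector, components v^i
M = np.zeros((3, 3))     # rank 2: a matrix, components M^i_j
T = np.zeros((3, 3, 3))  # rank 3: components with three indices, e.g. T^i_jk

print(v.ndim, M.ndim, T.ndim)  # 1 2 3
```

Here `ndim` reports the rank of each component array, while every index ranges over the same 3-dimensional space.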

Why are Tensors important in machine learning?

Tensors are important in machine learning because they can represent complex relationships between data and can be used to model and solve a variety of problems. They are also used in deep learning algorithms, which have revolutionized the field of artificial intelligence in recent years.
