Is There Meaning Behind (0,1) as a Tensor?

In summary: "So, I think that it would be helpful to connect this General Relativity approach to the mathematical approach that you have explained in this Insight." I agree 100%. Many students struggle to understand what a tensor is from a mathematical perspective before they ever encounter one in physics or quantum mechanics. As you mentioned, it can be helpful to explain it as a transformation between vector spaces, as well as to show how linear operators act on tensor products.
  • #36
StoneTemplePython said:
It probably should be noted that moving from a 2-D matrix to something like a 3-D or 4-D (or n-D) tensor is a bit like moving from 2-SAT to 3-SAT: most of the interesting things you'd want to do computationally (e.g. numerically finding eigenvalues or singular values) become NP-hard (e.g. see: https://arxiv.org/pdf/0911.1393.pdf).
Yes, interesting, isn't it? This tiny difference between ##2## and ##3## decides whether we're too stupid to handle those problems, or whether there is a difficulty inherent in the system. And lower bounds are generally hard to prove. I know that Strassen lost a bet on ##NP = P##. I've forgotten the exact year, but he thought we would have found the answer by the 90's. But I guess he enjoyed the balloon journey over the Alps anyway.
 
  • #37
WWGD said:
Thanks, but aren't there naturally-occurring tensors in which the factors are mixed? What do you then do?
Perhaps if you consider tensor algebras of ##\operatorname{Hom}(V,V^*)## or similar. I would group them pairwise in such a case: all even-indexed factors ##V## and all odd-indexed factors ##V^*##. This is what I really learned about tensors: it all heavily depends on what you want to do.
 
  • #38
fresh_42 said:
Perhaps if you consider tensor algebras of ##\operatorname{Hom}(V,V^*)## or similar. I would group them pairwise in such a case: all even-indexed factors ##V## and all odd-indexed factors ##V^*##. This is what I really learned about tensors: it all heavily depends on what you want to do.
Maybe we are referring to different things, but if we have a multilinear map defined on, say, ##V \otimes V^* \otimes V##, then the map would be altered by defining it on ##V \otimes V \otimes V^*## instead, wouldn't it?
 
  • #39
Say we have a map ##V \otimes V^* \otimes V = V_1 \otimes V^* \otimes V_2 \longrightarrow W##. Then it is an element of ##V_1^* \otimes V \otimes V_2^* \otimes W## (identifying ##V^{**}## with ##V## in finite dimensions), which can be regrouped as ##V \otimes V_1^* \otimes V_2^* \otimes W##, and we have the original kind of grouping again. I don't know of an example where the placement of ##V^*## depends on its being in between the copies of ##V##. As soon as algebras play a role, we factor out their multiplication rules anyway, or better, in a way such that the contravariance of ##W## is respected.
 
  • #40
I don't know if you mentioned this, but I think another useful perspective is that the tensor product also converts a ##k##-linear map into a linear map on the tensor product. (Let's stick to vector spaces over ##\mathbb{R}## and maps into the reals, to keep it simple for now.) For example, there is a map taking the dot product, as a bilinear map (##k = 2##) on ##\mathbb{R}^2 \times \mathbb{R}^2##, to a linear map defined on ##\mathbb{R}^2 \otimes \mathbb{R}^2## (into the reals, in this case). In general, we have a map from ##\{K : V_1 \times V_2 \times \dots \times V_k \to \mathbb{R}\}## to ##\{L : V_1 \otimes V_2 \otimes \dots \otimes V_k \to \mathbb{R}\}##, where ##K## is ##k##-linear and ##L## is linear. This perspective helps me understand things better; a sketch is below.
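A minimal numerical sketch of this correspondence, assuming NumPy (the function names are just illustrative): the dot product as a bilinear map on ##\mathbb{R}^2 \times \mathbb{R}^2## corresponds to a single covector acting linearly on ##\mathbb{R}^2 \otimes \mathbb{R}^2##.

[CODE=python]
import numpy as np

# The dot product as a bilinear map K : R^2 x R^2 -> R.
def bilinear_dot(u, v):
    return float(u @ v)

# The induced linear map L : R^2 (x) R^2 -> R is one 4-component covector
# acting on the flattened simple tensor u (x) v.  For the dot product,
# that covector is the flattened 2x2 identity matrix.
L = np.eye(2).reshape(-1)

def linear_on_tensor(u, v):
    simple_tensor = np.outer(u, v).reshape(-1)   # u (x) v as a vector in R^4
    return float(L @ simple_tensor)

u, v = np.array([1.0, 2.0]), np.array([3.0, -1.0])
assert np.isclose(bilinear_dot(u, v), linear_on_tensor(u, v))   # both give 1.0
[/CODE]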
 
  • #41
I listed my originally intended chapters here:
https://www.physicsforums.com/threads/what-is-a-tensor-comments.917927/#post-5788263
where universality, natural isomorphisms, and whatever else comes to mind regarding tensors would have been included, but this tended to become about 40-50 pages, and I wasn't really prepared for such a long explanation... And after this debate, I'm sure that even then some would have thought I left out an essential part, or described something differently from what they are used to, and so on. It would have been interesting to learn more about the physical side of it, all the more as a tensor to me is merely a multilinear product, which only gets interesting once a subspace is factored out. If only there weren't these coordinate transformations and indices wherever you look. :wideeyed:
 
  • #42
WWGD said:
Is there a reason why we group together the contravariant and covariant factors? Why not have, e.g., ##T^p_q = V \otimes V^* \otimes V \otimes \dots##, etc.?

Isn't this just a matter of convention? If you have a tensor ##t## of type ##V \otimes V^* \otimes V##, anything you want to do with ##t##, you can do the analogous thing with the tensor ##t'## of type ##V \otimes V \otimes V^*##. There is only a notational difficulty, which is indicating which arguments of one tensor are contracted with which arguments of a different tensor. But the Einstein summation convention makes this explicit.
 
  • #43
stevendaryl said:
Isn't this just a matter of convention? If you have a tensor ##t## of type ##V \otimes V^* \otimes V##, anything you want to do with ##t##, you can do the analogous thing with the tensor ##t'## of type ##V \otimes V \otimes V^*##. There is only a notational difficulty, which is indicating which arguments of one tensor are contracted with which arguments of a different tensor. But the Einstein summation convention makes this explicit.
I meant not just for contraction but for describing the general type (co- and contravariant) of the tensor; I was wondering if a tensor with mixed factors, like ##V \otimes V^* \otimes V \otimes V^* \otimes \dots##, could always be expressed as ##V \otimes V \otimes \dots \otimes V^* \otimes V^* \otimes \dots##, though I agree that when contracting, the order does not matter. Basically, could we use contraction to show the two types above are equivalent? I am being kind of lazy; let me try it. (A quick numerical check is sketched below.)
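A quick numerical sanity check of the reordering point, assuming NumPy (the tensor and the vectors fed into it are random placeholders): permuting a tensor's slots is harmless as long as the contraction bookkeeping, here done via `einsum`, is permuted along with them.

[CODE=python]
import numpy as np

rng = np.random.default_rng(0)
t = rng.standard_normal((2, 2, 2))   # slots ordered as (V, V*, V)
v = rng.standard_normal(2)           # fed into the V* slot
w = rng.standard_normal(2)           # a covector, fed into the first V slot

# Contract the V* slot with v and the first V slot with w.
a = np.einsum('ijk,j,i->k', t, v, w)

# The same tensor with slots regrouped as (V, V, V*): move axis 1 to the end.
t_regrouped = np.transpose(t, (0, 2, 1))
b = np.einsum('ikj,j,i->k', t_regrouped, v, w)

assert np.allclose(a, b)   # same contraction, just a different slot order
[/CODE]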
 
  • #44
Concerning this point:

lavinia said:
Given a linear map between two vector spaces ##L : V \to W##, ##L## determines a map of the algebra of tensor products of vectors in ##V## to the algebra of tensor products of vectors in ##W##. This correspondence is a covariant functor. ##L## also determines a map of the algebra of tensor products of dual vectors in ##W## to the algebra of tensor products of dual vectors in ##V##. This correspondence is a contravariant functor.

One might guess that this is the reason for the terms covariant and contravariant tensor though I do not know the history.

fresh_42 said:
I agree. This would be a natural way to look at it. However, the German Wikipedia does it the other way around, and the English one speaks of considering ##V## as ##V^{**}## and refers to basis transformations as the origin of the terminology. I find this a bit unsatisfactory as motivation, but failed to find a good reason for a different convention.

This is what Laurent Schwartz writes in "Les Tenseurs" (1975), translated from the French:

These rules are very convenient for technical calculations, but they rest on a historical error, one that will keep playing tricks on humanity for several centuries to come. They were established at a time when coordinates were manipulated more than vectors. They thus end up calling contravariant (contra = against) what relates to ##E##, and covariant (co = with) what relates to ##E^*##! In all theoretical reasoning using tensor products (and they cover all of mathematics today), this is a catastrophe. A vector of ##E## (resp. ##E^*##) is called a contravariant (resp. covariant) tensor! What is true (and this is where the name comes from) is that the system of coordinates of a vector (i.e. the tensor ##x^i = \langle \epsilon^i, x \rangle##, the ##\epsilon^i## forming the dual basis) is contravariant; but in modern mathematics, a vector is something other than the system of its coordinates! An element of ##E## should have been called a covariant tensor, and an element of ##E^*## a contravariant tensor, even if that means pointing out that the coordinates vary in the opposite direction.
 
  • #45
If I understand the point: if one writes a vector in terms of a basis, then its coefficients are picked out by the dual basis. So the coefficients are contravariant.
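A minimal numerical sketch of that contravariance, assuming NumPy (the basis and change-of-basis matrix are chosen arbitrarily): if the new basis vectors are the columns of ##B' = BP##, then the coordinates of a fixed vector transform with ##P^{-1}##, i.e. "against" the basis.

[CODE=python]
import numpy as np

B = np.eye(2)                      # old basis vectors as columns
P = np.array([[2.0, 1.0],
              [1.0, 1.0]])         # invertible change-of-basis matrix
B_new = B @ P                      # new basis vectors as columns

v = np.array([3.0, 5.0])           # a fixed vector, in standard coordinates

coords_old = np.linalg.solve(B, v)       # coordinates w.r.t. the old basis
coords_new = np.linalg.solve(B_new, v)   # coordinates w.r.t. the new basis

# The basis transforms with P, the coordinates with P^{-1}: "contravariant".
assert np.allclose(coords_new, np.linalg.inv(P) @ coords_old)
[/CODE]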
 
  • #46
A clear exposition of the physics approach to tensors is in Leonard Susskind's lectures on General Relativity, starting somewhere around minute 40 of lecture 3.

 
  • #47
"A scalar can be viewed as the coordinate of one dimensional vector space, the component of a basis Vector."
Respected Sir,
can you please explain this statement that you made in your answer?
 
  • #48
Deepak Solanki said:
"A scalar can be viewed as the coordinate of one dimensional vector space, the component of a basis Vector."
Respected Sir,
can you please explain this statement that you made in your answer?
If we have a ##1##-dimensional vector space ##V## with a basis vector ##\vec{b}##, then every vector ##\vec{v}## can be written ##\vec{v} = c \cdot \vec{b}##. This means ##c## is the scalar which transforms ##\vec{b}## into ##\vec{v}##: the coordinate of ##\vec{v}## in the basis ##\{\vec{b}\}##, and the component of ##\vec{v}## with respect to ##\vec{b}##. And ##c \leftrightarrow \vec{v}## constitutes an isomorphism between the field ##\mathbb{F}## and ##V##.
 
  • #49
Deepak Solanki said:
"A scalar can be viewed as the coordinate of one dimensional vector space, the component of a basis Vector."
Respected Sir,
can you please explain this statement that you made in your answer?
Think of the reals as a vector space over itself. Then any vector, i.e. any real number, is a multiple of any fixed non-zero number. Generalize this to any other one-dimensional vector space over the reals.
 
  • #50
This excerpt from Wikipedia shows a motivation for using tensors:

"Because they express a relationship between vectors, tensors themselves must be independent of a particular choice of basis. The basis independence of a tensor then takes the form of a covariant transformation law that relates the array computed in one basis to that computed in another one."

I believe this might be one of the most important characteristics of tensors for differential geometry and general relativity. (both essentially over my head)

Thanks for taking the time and effort to write this article.
 
  • #51
Thuring said:
This excerpt from Wikipedia shows a motivation for using tensors:

"Because they express a relationship between vectors, tensors themselves must be independent of a particular choice of basis. The basis independence of a tensor then takes the form of a covariant transformation law that relates the array computed in one basis to that computed in another one."

I believe this might be one of the most important characteristics of tensors for differential geometry and general relativity. (both essentially over my head)

Thanks for taking the time and effort to write this article.
I agree that this is the key feature of a tensor. It is an entity that is defined in such a way that its representations in different coordinate systems satisfy the covariant/contravariant transformation rules. Mathematically, a tensor can be considered an equivalence class of coordinate system representations that satisfy the covariant/contravariant transformation rules. This gives tensors the great advantage of being coordinate system agnostic.
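As a small numerical illustration of that equivalence-class view, assuming NumPy (the matrices are arbitrary placeholders): the two arrays below are representations of the same ##(1,1)##-tensor in different bases, related by the transformation rule ##A' = P^{-1} A P##, and basis-independent quantities such as the trace and the eigenvalues agree.

[CODE=python]
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 3.0]])         # a (1,1)-tensor in the old basis
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])         # change-of-basis matrix

A_new = np.linalg.inv(P) @ A @ P   # the same tensor in the new basis

# The arrays differ, but basis-independent quantities do not:
assert not np.allclose(A, A_new)
assert np.isclose(np.trace(A), np.trace(A_new))
assert np.allclose(np.sort(np.linalg.eigvals(A)),
                   np.sort(np.linalg.eigvals(A_new)))
[/CODE]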
 
  • #52
FactChecker said:
This gives tensors the great advantage of being coordinate system agnostic.
Yes, but this is similarly difficult to explaining why vectors and linear transformations, although usually represented by an array of numbers or a matrix, are not those arrays but entities of their own, as you said, agnostic of coordinates. A tensor is in the end merely a continuation of scalar ##\rightarrow## vector ##\rightarrow## matrix to higher dimensions. Tensors often appear as if they were something special when we speak of stress-energy tensors or curvature tensors. This is as if we automatically associated a force with a vector, or a rotation with a matrix. I often get the impression that physicists use the term tensor but mean a certain example. It's just a multilinear array or transformation - any multilinear transformation - which are already two interpretations of the same thing. The entire co-/contravariant business is also an interpretation, and, in my mind, sometimes a bit arbitrary.
 
  • #53
Sooo, are you hinting that perhaps the biggest advantage to using "tensors" is the notation?
 
  • #54
Perhaps the biggest difference between vectors, matrices, and linear algebra on the one hand and "tensors" on the other is the attitude or conception of the users. My simple-minded, pragmatic definition of a tensor is essentially a matrix using the tensor notation. Matrices are easily visualized; they have a shape and size. Tensor notation considers only one coefficient at a time, but a lot of them.

(Butt then, eye am knot a reel physicist)
 
  • #55
Thuring said:
Sooo, are you hinting that perhaps the biggest advantage to using "tensors" is the notation?
To some extent, yes. Tensors, on the other hand, are quite versatile, just as matrices are. E.g. scalars, vectors, and matrices are also tensors. And they form a tensor algebra with a universal property, i.e. many algebras can be realized as quotient algebras of the tensor algebra; see the example below. So it is the same as with vectors: it all depends on what we use them for. In the end they are simply an arrow, an arrow that serves many, many applications.
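For instance, a standard example, stated here for concreteness: the exterior algebra and the symmetric algebra both arise as quotients of the tensor algebra ##T(V) = \bigoplus_{k \ge 0} V^{\otimes k}##,
$$\Lambda(V) = T(V)/\langle v \otimes v \rangle, \qquad S(V) = T(V)/\langle v \otimes w - w \otimes v \rangle,$$
where ##\langle \cdot \rangle## denotes the two-sided ideal generated by the indicated elements.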
 
  • #56
fresh_42 said:
Yes, but this is similarly difficult to explaining why vectors and linear transformations, although usually represented by an array of numbers or a matrix, are not those arrays but entities of their own, as you said, agnostic of coordinates.
Yes, if they are defined in a way that is agnostic to coordinate systems. It is possible to define tuples that cannot have any physical or geometric meaning. I can define the tuple ##(1,0)## in all coordinate systems, but it does not transform at all -- it is ##(1,0)## in any coordinate system. It is not a tensor. Tensors can have a physical or geometric meaning that is independent of the choice of coordinate system. The tuple ##(1,0)##, defined that way regardless of coordinate system, cannot have a physical or geometric meaning. There are similar examples for matrices, and they can be found throughout mathematics. (The sketch below contrasts the two behaviors.)
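A minimal sketch of this contrast, assuming NumPy (the change of basis is an arbitrary axis swap): under a change of basis, a genuine vector's components transform with ##P^{-1}##, while the "constant tuple" is decreed to be ##(1,0)## in every basis and therefore fails the transformation rule.

[CODE=python]
import numpy as np

P = np.array([[0.0, 1.0],
              [1.0, 0.0]])          # change of basis: swap the two axes

v_old = np.array([1.0, 0.0])        # components of a genuine vector, old basis
v_new = np.linalg.inv(P) @ v_old    # tensor rule: components transform with P^{-1}

tuple_old = np.array([1.0, 0.0])    # the "constant tuple" in the old basis
tuple_new = np.array([1.0, 0.0])    # ...decreed to be (1,0) in the new basis too

print(v_new)       # [0. 1.] -- the vector's components respond to the basis change
print(tuple_new)   # [1. 0.] -- the tuple's do not, so it is not a tensor
[/CODE]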
 
  • #57
Of course, because you defined ##(0,1)## as a tuple plus the absence of meaning, and then reasoned that it has no meaning. That's a tautology. ##(0,1)## has a meaning as soon as it is associated with a point in a coordinate system, namely the vector from the origin to this point, even over ##\mathbb{Z}_2##. It transforms, at the very least by a permutation of the axes to ##(1,0)##, and is of course a tensor, as all vectors are. Examples of matrices which aren't representatives of a linear transformation in some coordinate system need to have a meaning attached to them which excludes such a transformation. Maybe a matrix of pixels in an image. But even then each entry has an RGB coordinate and is again in some sense a tensor. Whether this tensor makes any sense as a multilinear object is another question.
 
  • #58
fresh_42 said:
Of course, because you defined ##(0,1)## as a tuple plus the absence of meaning, and then reasoned that it has no meaning.
It has no physical or geometric meaning, but can have mathematical meaning and properties. Those concepts are common in mathematics.
 
