# The tensor product of tensors confusion

In summary, a tensor is a multilinear function that takes in vectors and covectors and produces a number. The tensor product of two tensors ##T_1## and ##T_2## of types ##(r_1, s_1)## and ##(r_2, s_2)## can be viewed as an ##(r_1+r_2,\ s_1+s_2)## tensor, where ##r_i## and ##s_i## denote the number of vectors and covectors in ##T_i##. The product tensor acts on a tuple of vectors and covectors from a vector space ##V##, and the result of applying it is obtained by multiplying the results of applying the two multiplicand tensors to their respective arguments.

#### GR191511

TL;DR Summary
Why is the tensor product of two tensors again a tensor?
> **Exercise.** Let ##T_1## and ##T_2## be tensors of type ##(r_1, s_1)## and ##(r_2, s_2)## respectively on a vector space ##V##. Show that ##T_1\otimes T_2## can be viewed as an ##(r_1+r_2,\ s_1+s_2)## tensor, so that the
> tensor product of two tensors is again a tensor, justifying the
> nomenclature...

What I'm reading: *An Introduction to Tensors and Group Theory for Physicists* by Nadir Jeevanjee. According to it, a tensor is a multilinear function that eats ##r## vectors as well as ##s## dual vectors and produces a number... And "Given two finite-dimensional vector spaces ##V## and ##W##, we define their tensor product ##V\otimes W## to be the set of all ##C##-valued bilinear functions on ##V^*\times W^*##..."
How do I define the tensor product of two tensors from these definitions? And what should the target object that this tensor product acts on look like?

GR191511 said:
Summary:: Why is the tensor product of two tensors again a tensor?

> **Exercise.** Let ##T_1## and ##T_2## be tensors of type ##(r_1, s_1)## and ##(r_2, s_2)## respectively on a vector space ##V##. Show that ##T_1\otimes T_2## can be viewed as an ##(r_1+r_2,\ s_1+s_2)## tensor, so that the
> tensor product of two tensors is again a tensor, justifying the
> nomenclature...

What I'm reading: *An Introduction to Tensors and Group Theory for Physicists* by Nadir Jeevanjee. According to it, a tensor is a multilinear function that eats ##r## vectors as well as ##s## dual vectors and produces a number... And "Given two finite-dimensional vector spaces ##V## and ##W##, we define their tensor product ##V\otimes W## to be the set of all ##C##-valued bilinear functions on ##V^*\times W^*##..."
How do I define the tensor product of two tensors from these definitions? And what should the target object that this tensor product acts on look like?
A tensor is a multilinear function, but it doesn't produce a number. Only the dual vectors, if fed vectors, turn into scalars; the part with the vectors remains.

We have
$$T_i=\underbrace{v_1\otimes v_2\otimes \ldots\otimes v_{r_i}}_{r_i\text{ many vectors}}\otimes \underbrace{v^*_1\otimes v^*_2\otimes \ldots\otimes v^*_{s_i}}_{s_i\text{ many covectors}}\quad (i=1,2)$$
so
$$T_1\otimes T_2=v_1\otimes v_2\otimes \ldots\otimes v_{r_1}\otimes v^*_1\otimes v^*_2\otimes \ldots\otimes v^*_{s_1}\otimes v_1\otimes v_2\otimes \ldots\otimes v_{r_2}\otimes v^*_1\otimes v^*_2\otimes \ldots\otimes v^*_{s_2}$$
which can be sorted into an equivalent tensor
$$T_1\otimes T_2=\underbrace{v_1\otimes v_2\otimes \ldots\otimes v_{r_1}\otimes v_1\otimes v_2\otimes \ldots\otimes v_{r_2}}_{r_1+r_2 \text{ many vectors}} \otimes \underbrace{v^*_1\otimes v^*_2\otimes \ldots\otimes v^*_{s_1}\otimes v^*_1\otimes v^*_2\otimes \ldots\otimes v^*_{s_2}}_{s_1+s_2 \text{ many covectors}}$$
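In components, the "sorting" above is nothing more than a permutation of array axes. A minimal numpy sketch (the vectors, shapes, and values here are my own illustration, not from the post):

```python
import numpy as np

# Elementary tensors on V = R^2: T1 is a (1,1) tensor built from one vector
# and one covector; T2 is a (1,0) tensor (just a vector).
v1, a1 = np.array([1.0, 2.0]), np.array([3.0, -1.0])   # vector, covector for T1
v2 = np.array([0.5, 4.0])                               # vector for T2

T1 = np.tensordot(v1, a1, axes=0)   # component array, axes: (vector, covector)
T2 = v2

# Tensor product in the order T1 ⊗ T2: axes are (vector, covector, vector)
prod = np.tensordot(T1, T2, axes=0)

# Sorting vectors first and covectors last is just an axis permutation,
# i.e. the (unwritten) isomorphism: axes -> (vector, vector, covector)
sorted_prod = np.transpose(prod, (0, 2, 1))

# The components are the same numbers, merely reindexed:
assert np.allclose(sorted_prod[0, 1, 0], prod[0, 0, 1])
```

Nothing about the entries changes under the transpose; only the bookkeeping of which slot is which is updated.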

Here is an example of a ##(1,2)## tensor, which is a bilinear function:
https://www.physicsforums.com/insights/what-is-a-tensor/

fresh_42 said:
A tensor is a multilinear function, but it doesn't produce a number. Only the dual vectors, if fed vectors, turn into scalars; the part with the vectors remains.
I think we can say it produces a number. If we take the usual definition of an ##(m, n)## tensor over a vector space ##V## on a field ##F## as a multilinear function whose domain is the Cartesian product of ##m## copies of ##V## and ##n## copies of the dual space ##V^*##, and whose range is ##F##, then we can say the function 'produces a number', meaning just that it maps ordered tuples of ##m## vectors and ##n## covectors to numbers (scalars).

GR191511 said:
How do I define the tensor product of two tensors from these definitions? And what should the target object that this tensor product acts on look like?
If for ##i\in\{1,2\}##, tensor ##T_i## over vector space V and field F is ##(m_i\ \ n_i)##, it maps ##m_i## vectors in V (##v_{i,1}, v_{i,2}, ..., v_{i,m_i} ## ) and ##n_i## covectors in V* ( ## p_{i,1}, p_{i,2}, ..., p_{i,n_i}## ) to a number in F, denoted by :
$$T_i(v_{i,1}, v_{i,2}, ..., v_{i,m_i}, p_{i,1}, p_{i,2}, ..., p_{i,n_i})$$

then the tensor ##T_1\otimes T_2## is ##(m_1+m_2\ \ n_1+n_2)## and maps ##m_1+m_2## vectors in V (##u_1,u_2, .., u_{m_1+m_2}## ) and ##n_1+n_2## covectors in V* ( ## q_1, q_2,..., q_{n_1+n_2}## ) to a number in F, denoted by
$$(T_1\otimes T_2) (u_1,u_2, .., u_{m_1+m_2}, q_1, q_2,..., q_{n_1+n_2})$$
which is equal to
$$T_1(u_1, u_2, ..., u_{m_1}, q_1, q_2, ..., q_{n_1})\times T_2(u_{m_1+1}, u_{m_1+2}, ..., u_{m_1+m_2}, q_{n_1+1}, q_{n_1+2}, ..., q_{n_1+n_2})$$
In other words, to get the result of applying the product tensor, you just multiply the results of applying the two multiplicand tensors to their respective arguments, taken from the appropriate places in the input tuple of the product.
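As a sanity check, this defining formula can be verified numerically for small concrete tensors. This is only an illustrative sketch (the shapes, seed, and names are mine, not from the post):

```python
import numpy as np

# T1: a (1,1) tensor (one vector slot, one covector slot) on V = R^2,
# T2: a (2,0) tensor (two vector slots). Components chosen at random.
rng = np.random.default_rng(0)
T1 = rng.standard_normal((2, 2))   # axes: (vector slot, covector slot)
T2 = rng.standard_normal((2, 2))   # axes: (vector slot, vector slot)

def ev_T1(u, q):                   # T1(u, q)
    return np.einsum('ij,i,j->', T1, u, q)

def ev_T2(u1, u2):                 # T2(u1, u2)
    return np.einsum('ij,i,j->', T2, u1, u2)

def ev_product(u1, u2, u3, q1):
    # (T1 ⊗ T2)(u1, u2, u3, q1): route u1, q1 to T1 and u2, u3 to T2, multiply
    return ev_T1(u1, q1) * ev_T2(u2, u3)

# The same number comes from contracting the component array of T1 ⊗ T2
# (vector slots first, covector slot last):
P = np.einsum('ij,kl->iklj', T1, T2)

u1, u2, u3 = rng.standard_normal((3, 2))
q1 = rng.standard_normal(2)
assert np.isclose(np.einsum('iklj,i,k,l,j->', P, u1, u2, u3, q1),
                  ev_product(u1, u2, u3, q1))
```

The agreement is just distributivity: the full contraction of the product array factors into the two separate contractions.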

andrewkirk said:
I think we can say it produces a number. If we take the usual definition of an ##(m, n)## tensor over a vector space ##V## on a field ##F## as a multilinear function whose domain is the Cartesian product of ##m## copies of ##V## and ##n## copies of the dual space ##V^*##, and whose range is ##F##, then we can say the function 'produces a number', meaning just that it maps ordered tuples of ##m## vectors and ##n## covectors to numbers (scalars).
No, we cannot. You can feed the covectors, but the vectors remain vectors. ##\sum u_i\otimes v_i## is a matrix, not a number.

I think you are misreading the OP. 'It produces a number' doesn't say the tensor product is a number. It says that it is a function that delivers a result that is a number, i.e. the function's range is a space of numbers (typically ##\mathbb R## or ##\mathbb C##).

fresh_42 said:
A tensor is a multilinear function, but it doesn't produce a number. Only the dual vectors, if fed vectors, turn into scalars; the part with the vectors remains.

We have
$$T_i=\underbrace{v_1\otimes v_2\otimes \ldots\otimes v_{r_i}}_{r_i\text{ many vectors}}\otimes \underbrace{v^*_1\otimes v^*_2\otimes \ldots\otimes v^*_{s_i}}_{s_i\text{ many covectors}}\quad (i=1,2)$$
so
$$T_1\otimes T_2=v_1\otimes v_2\otimes \ldots\otimes v_{r_1}\otimes v^*_1\otimes v^*_2\otimes \ldots\otimes v^*_{s_1}\otimes v_1\otimes v_2\otimes \ldots\otimes v_{r_2}\otimes v^*_1\otimes v^*_2\otimes \ldots\otimes v^*_{s_2}$$
which can be sorted into an equivalent tensor
$$T_1\otimes T_2=\underbrace{v_1\otimes v_2\otimes \ldots\otimes v_{r_1}\otimes v_1\otimes v_2\otimes \ldots\otimes v_{r_2}}_{r_1+r_2 \text{ many vectors}} \otimes \underbrace{v^*_1\otimes v^*_2\otimes \ldots\otimes v^*_{s_1}\otimes v^*_1\otimes v^*_2\otimes \ldots\otimes v^*_{s_2}}_{s_1+s_2 \text{ many covectors}}$$

Here is an example of a ##(1,2)## tensor, which is a bilinear function:
https://www.physicsforums.com/insights/what-is-a-tensor/

Why can the ##s_1## covectors and the ##r_2## vectors commute with each other in the last step?

andrewkirk said:
I think you are misreading the OP. 'It produces a number' doesn't say the tensor product is a number. It says that it is a function that delivers a result that is a number, i.e. the function's range is a space of numbers (typically ##\mathbb R## or ##\mathbb C##).
Sure, but this is definitely wrong! Except when we have tensors of type ##(0,m)##.

If we have ##(n,m)## tensors with ##n>0##, then there is no way to make ##n## go away. E.g. a ##(1,2)## tensor is, considered as a function, an algebra multiplication. Since when does an algebra multiplication produce a number?

GR191511 said:
Why can the ##s_1## covectors and the ##r_2## vectors commute with each other in the last step?
It is not that you commute the factors. It is, strictly speaking, an isomorphism of vector spaces, or tensor spaces if you like; an isomorphism that isn't written down because it doesn't really carry information. It's simply convenient to sort them into the ##(n,m)## scheme. And if you consider a tensor space ##T(V)## over a vector space ##V## as an algebra with the multiplication ##(T_1,T_2)\longmapsto T_1\otimes T_2##, then you have to apply this isomorphism. Loosely speaking: it doesn't matter whether we are talking about length times height or height times length, although there would be a formal difference.

fresh_42 said:
Sure, but this is definitely wrong! Except when we have tensors of type ##(0,m)##.

If we have ##(n,m)## tensors with ##n>0##, then there is no way to make ##n## go away. E.g. a ##(1,2)## tensor is, considered as a function, an algebra multiplication. Since when does an algebra multiplication produce a number?
I don't see what the problem is! The input is vectors and one-forms; the output is a number.

martinbn said:
I don't see what the problem is! The input is vectors and one-forms; the output is a number.
What number? Repetition does not make it right. Let ##V## be a vector space, and ##v\in V##. Then ##v## is obviously a vector and not a number. However, ##v## is a tensor.

If ##v^*\in V^*##, then ##w\mapsto v^*(w)## is a function that produces a number, but not in the case ##v\in V##.

fresh_42 said:
What number? Repetition does not make it right. Let ##V## be a vector space, and ##v\in V##. Then ##v## is obviously a vector and not a number. However, ##v## is a tensor.

If ##v^*\in V^*##, then ##w\mapsto v^*(w)## is a function that produces a number, but not in the case ##v\in V##.
In this case ##v## can be viewed as an element of ##V^{**}##, so for each ##\omega\in V^*## you get the number ##v(\omega)##.

martinbn said:
In this case ##v## can be viewed as an element of ##V^{**}##, so for each ##\omega\in V^*## you get the number ##v(\omega)##.
Yes, and the projection onto the first coordinate is a number, too. The question was "what is a tensor", not "what can I do to make it a number". To call a tensor a multilinear function into the scalar field is wrong. Your construction as an excuse is additional framework. It is as if you were saying that all vectors are covectors. This is confusing and wrong, at least in mathematics. That physicists always turn vectors into covectors is another subject.

A vector is a tensor, and it is neither a function nor a number.

Repetition of a wrong claim does not change this fact.

fresh_42 said:
Yes, and the projection onto the first coordinate is a number, too. The question was "what is a tensor", not "what can I do to make it a number". To call a tensor a multilinear function into the scalar field is wrong. Your construction as an excuse is additional framework. It is as if you were saying that all vectors are covectors. This is confusing and wrong, at least in mathematics. That physicists always turn vectors into covectors is another subject.

A vector is a tensor, and it is neither a function nor a number.

Repetition of a wrong claim does not change this fact.
This is standard in mathematics. Many define tensors as multilinear maps. It is equivalent to what you consider the only definition and cannot be wrong. It is a matter of personal preference.

It is also not hard to see that the space of all multilinear maps

##V^*\times\cdots\times V^*\times V\times\cdots\times V \rightarrow \mathbb R##

and the space

##V\otimes\cdots\otimes V\otimes V^*\otimes\cdots\otimes V^* ##

are canonically isomorphic.
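The identification is easiest to see on elementary tensors: under the canonical isomorphism, ##v\otimes w^*## corresponds to the bilinear map ##(\alpha, u)\mapsto \alpha(v)\,w^*(u)##. A small numerical illustration (the concrete numbers are my own):

```python
import numpy as np

# V = R^3. The elementary tensor v ⊗ w* corresponds to the bilinear map
# (alpha, u) |-> alpha(v) * w*(u) on V* x V.
v = np.array([1.0, 2.0, 0.0])
w_star = np.array([0.0, 1.0, 3.0])

M = np.outer(v, w_star)        # component array of v ⊗ w*

def as_bilinear(alpha, u):
    # the multilinear map canonically attached to the component array M
    return alpha @ M @ u

alpha = np.array([1.0, 0.5, -2.0])
u = np.array([1.0, 1.0, 1.0])

# Both sides give alpha(v) * w*(u):
assert np.isclose(as_bilinear(alpha, u), (alpha @ v) * (w_star @ u))
```

By linearity the same check extends from elementary tensors to arbitrary sums of them.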

martinbn said:
This is standard in mathematics. Many define tensors as multilinear maps. It is equivalent to what you consider the only definition and cannot be wrong. It is a matter of personal preference.

It is also not hard to see that the space of all multilinear maps

##V^*\times\cdots\times V^*\times V\times\cdots\times V \rightarrow \mathbb R##

and the space

##V\otimes\cdots\otimes V\otimes V^*\otimes\cdots\otimes V^* ##

are canonically isomorphic.
It would be more accurate to say that all tensors are vectors, scalars and covectors included, than the other way around.

Mathematics distinguishes between scalars, vectors, and covectors. Any isomorphisms come afterward!

When it comes to differential forms or tangent spaces, things become complicated. ##dx## is still a vector, a tangent, but it can be evaluated at a certain point, which makes it a number, a slope. So some tangent vectors can be seen as functions that produce numbers, but this is not what a tensor is; it is how it can be used. That is something different.

E.g. I can write ##\mathfrak{su}(2)## as a Lie algebra of differential operators. Its multiplication structure is a bilinear mapping ##\mathfrak{su}(2) \times \mathfrak{su}(2) \longrightarrow \mathfrak{su}(2)##, which is a ##(2,1)## tensor. And although the vectors are written as differentials, the tensor itself doesn't become a number-producing machine; it produces differential operators.

To consider curvature tensors and the like, one has to put additional structure on the tensor. It is what you make out of it, not what it is. A tensor is still simply a vector, because it is an element of a vector space. And vectors and covectors are different objects, mathematically.

The phrasing in the OP's book is misleading. Not all vectors are covectors, but all covectors are vectors.

fresh_42 said:
It would be more accurate to say that all tensors are vectors, scalars and covectors included, than the other way around.

Mathematics distinguishes between scalars, vectors, and covectors. Any isomorphisms come afterward!

When it comes to differential forms or tangent spaces, things become complicated. ##dx## is still a vector, a tangent, but it can be evaluated at a certain point, which makes it a number, a slope. So some tangent vectors can be seen as functions that produce numbers, but this is not what a tensor is; it is how it can be used. That is something different.

E.g. I can write ##\mathfrak{su}(2)## as a Lie algebra of differential operators. Its multiplication structure is a bilinear mapping ##\mathfrak{su}(2) \times \mathfrak{su}(2) \longrightarrow \mathfrak{su}(2)##, which is a ##(2,1)## tensor. And although the vectors are written as differentials, the tensor itself doesn't become a number-producing machine; it produces differential operators.

To consider curvature tensors and the like, one has to put additional structure on the tensor. It is what you make out of it, not what it is. A tensor is still simply a vector, because it is an element of a vector space. And vectors and covectors are different objects, mathematically.

The phrasing in the OP's book is misleading. Not all vectors are covectors, but all covectors are vectors.
I am not sure how this relates to what I said. My point is simply that there is nothing wrong, and it is often done, especially in differential geometry books, to define tensors as multilinear maps. If the OP uses a book that goes that way, does it help to call it "wrong"?

It is wrong to treat all vectors as covectors! You can either a) do it the other way around, b) not ask a mathematician, or c) learn it wrong.

We would be having the very same discussion the other way around if I claimed that all vectors are covectors and that there is no difference between them. You bet!

fresh_42 said:
It is wrong to treat all vectors as covectors! You can either a) do it the other way around, b) not ask a mathematician, or c) learn it wrong.

We would be having the very same discussion the other way around if I claimed that all vectors are covectors and that there is no difference between them. You bet!
I am still not sure what you mean. What do you mean by "treat all vectors as covectors"?

martinbn said:
I am still not sure what you mean. What do you mean by "treat all vectors as covectors"?
Sure, that is what you repeatedly and falsely claim. You need covectors to get scalars, and you claim that all parts of a tensor are covectors. Sorry, but this is still wrong. I have provided multiple examples where such a statement is obviously wrong, yet you insist that vectors are covectors. And no, do not say no: this is exactly what it means if you want to spit out numbers. Disguise it as you want: covectors are vectors (though primarily not in this context, where both were already distinguished in the definition of ##(n,m)## or ##(r,s)##), but vectors are not covectors; everything else is voodoo.

Here is what I said: given two (or any number of) vector spaces ##V## and ##W## over a field ##F##, the vector spaces ##T=V\otimes W## and ##L=\text{all multilinear maps from } V^*\times W^*\text{ to }F## are canonically isomorphic. This is TRUE. Therefore one can, and one sometimes does, use the second as a definition of the first.

martinbn said:
Here is what I said: given two (or any number of) vector spaces ##V## and ##W## over a field ##F##, the vector spaces ##T=V\otimes W## and ##L=\text{all multilinear maps from } V^*\times W^*\text{ to }F## are canonically isomorphic. This is TRUE. Therefore one can, and one sometimes does, use the second as a definition of the first.

Because the definition says ##(r,s)##-tensor. This means ##r## vectors and ##s## covectors.

I am complaining that you basically claim it is a ##(0,r+s)##-tensor, which it is not.
(I know the difference, but that doesn't become clear at all by what you said.)

An ##(r,s)##-tensor ##T##, regarded as a function, is a multilinear map ##T: V^s \longrightarrow V^{\otimes r}##, not a map into a field. It is simply wrong and misleading to state otherwise.

Your claims all rely on additional constructions. The ##r## part consists of vectors, not functions. What makes me angry is that I know that you know you are talking bs.

I give up. It is not my problem that you are stubborn.

martinbn said:
I give up. It is not my problem that you are stubborn.
Yep, continue to tell everyone that all vectors are covectors. I think you should write a paper. It would be a revolution in homological algebra!

fresh_42 said:
Yep, continue to tell everyone that all vectors are covectors. I think you should write a paper. It would be a revolution in homological algebra!
I also studied tensors as multilinear maps from ##V^{*p} \times V^{q}## to ##\mathbb{R}##; could you explain exactly how this implies that all vectors are covectors?

Gaussian97 said:
I also studied tensors as multilinear maps from ##V^{*p} \times V^{q}## to ##\mathbb{R}##; could you explain exactly how this implies that all vectors are covectors?
##V^{*p}## eats ##p## many independent vectors and turns them into scalars.
But what do you do with ##V^{q}##? They remain vectors and do not map to ##\mathbb{R}##.
If you want to map them to ##\mathbb{R}##, then you have to make them covectors, i.e. eating vectors to turn them into numbers, or manipulate them otherwise.

Now my questions:
• If we consider this ##(p,q)##-tensor as multilinear mapping into ##\mathbb{R}##, why isn't it a ##(p+q,0)##-tensor to begin with?
• What happened to the vectors?
• Strassen's algorithm for matrix multiplication is a ##(2,1)##-tensor. Where does it end up in ##\mathbb{R}##?
• A vector isn't a function. If we treat it as such, then we must do something, namely setting ##(V^*)^* =V## or something else, e.g. evaluating a differential form at a certain point. These cannot be viewed as automatisms. We actively turn a vector into something else, mostly into ##v \leftrightsquigarrow (w \longmapsto \langle w,v \rangle).## And if there is no inner product?
• One has to be careful with infinite-dimensional spaces:
https://en.wikipedia.org/wiki/Dual_space#Infinite-dimensional_case

fresh_42 said:
##V^{*p}## eats ##p## many independent vectors and turns them into scalars.
But what do you do with ##V^{q}##? They remain vectors and do not map to ##\mathbb{R}##.
If you want to map them to ##\mathbb{R}##, then you have to make them covectors, i.e. eating vectors to turn them into numbers, or manipulate them otherwise.
No, you feed them covectors.
fresh_42 said:
Now my questions:
• If we consider this ##(p,q)##-tensor as multilinear mapping into ##\mathbb{R}##, why isn't it a ##(p+q,0)##-tensor to begin with?
• What happened to the vectors?
• Strassen's algorithm for matrix multiplication is a ##(2,1)##-tensor. Where does it end up in ##\mathbb{R}##?
• A vector isn't a function. If we treat it as such, then we must do something, namely setting ##(V^*)^* =V## or something else, e.g. evaluating a differential form at a certain point. These cannot be viewed as automatisms. We actively turn a vector into something else, mostly into ##v \leftrightsquigarrow (w \longmapsto \langle w,v \rangle).## And if there is no inner product?
• One has to be careful with infinite-dimensional spaces:
https://en.wikipedia.org/wiki/Dual_space#Infinite-dimensional_case
##V^{**}## is canonically isomorphic to ##V##. No extra structure needed, no choice of basis needed, no nothing, no matter the dimension.
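The canonical map in question sends ##v## to the evaluation functional ##\omega\mapsto\omega(v)##; no basis or inner product enters. A tiny numerical sketch in finite dimensions (the numbers are arbitrary choices of mine):

```python
import numpy as np

# V = R^3. The canonical map V -> V** sends v to "evaluate covectors at v".
v = np.array([1.0, -2.0, 4.0])

def v_double_star(omega):
    # v viewed as an element of V**: it eats a covector and returns a number.
    # Note we only ever *apply* omega to v; no basis or inner product is chosen.
    return float(omega @ v)

omega = np.array([3.0, 0.0, 0.5])
assert np.isclose(v_double_star(omega), omega @ v)   # omega(v) by definition
```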

Mmm... Probably this is a silly question, but when you say that ##V^{*p}## eats ##p## many vectors, do you mean that the elements of ##V^{*p}## are linear maps that eat ##p## many vectors? Otherwise, can you specify what you mean?

fresh_42 said:
But what do you do with ##V^{q}##? They remain vectors and do not map to ##\mathbb{R}##.
If you want to map them to ##\mathbb{R}##, then you have to make them covectors, i.e. eating vectors to turn them into numbers, or manipulate them otherwise.
I don't understand this. What do I do with the elements of ##V^q##? I map them to ##\mathbb{R}##; isn't this precisely what a map does?
When we talk about a map ##V^n \to \mathbb{R}##, what do you do with ##V^n##? Of course, you can view the action of mapping them as a "manipulation", but I don't see what the problem is.
As for saying that to define such a map we need to consider them covectors... I was introduced to covectors precisely as maps ##V^n \to \mathbb{R}##, so are we supposed to need covectors to define covectors? Probably I'm not seeing something very trivial, but I don't understand your question.

I'm not an expert, but I'll try my best to answer your questions (although the answers have a high probability of being wrong).

fresh_42 said:
• If we consider this ##(p,q)##-tensor as multilinear mapping into ##\mathbb{R}##, why isn't it a ##(p+q,0)##-tensor to begin with?
A ##(p,q)## tensor is a multilinear map ##V^{*p}\times V^q \to \mathbb{R}##, i.e. it eats ##p## covectors and ##q## vectors and returns a number. A ##(p+q, 0)## tensor is a multilinear map ##V^{*(p+q)} \to \mathbb{R}##, so it eats ##p+q## covectors and returns a number. Precisely because vectors and covectors are not the same thing, why should a ##(p,q)## tensor be a ##(p+q, 0)## tensor?

fresh_42 said:
• What happened to the vectors?
Well, I think this is the same question as before, which I don't really understand. My best answer is: they get mapped, together with the covectors, to a real number.
fresh_42 said:
• Strassen's algorithm for matrix multiplication is a ##(2,1)##-tensor. Where does it end up in ##\mathbb{R}##?
Well, in the same way that for you what I call tensors aren't tensors, I could just say that such a thing is not a tensor. I think that, given a map ##T: V^2\to V##, you can always create a tensor
$$\tilde{T}: V^*\times V^2 \to \mathbb{R}, \qquad \tilde{T}(\omega, v_1, v_2)=\omega(T(v_1,v_2))$$
In the other direction, given an arbitrary map ##T: V^*\times V^2 \to \mathbb{R}##, you can define a map
$$\tilde{T}: V^2 \to V, \qquad \tilde{T}(v_1, v_2)= T(e^i, v_1, v_2) e_i$$
with ##e_i## a basis of ##V## and ##e^i## the dual basis, and an implicit sum over ##i##.
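These two constructions can be checked against each other numerically in coordinates. A sketch for ##V=\mathbb{R}^2## with the standard basis (the component array ##C## and the random inputs are my own illustration):

```python
import numpy as np

# Components of a (1,2) tensor on V = R^2 in the standard basis:
# C[i, j, k] = T(e^i, e_j, e_k), with e_i the basis and e^i its dual basis.
rng = np.random.default_rng(1)
C = rng.standard_normal((2, 2, 2))

def T(omega, v1, v2):
    # the tensor as a multilinear map V* x V^2 -> R
    return np.einsum('ijk,i,j,k->', C, omega, v1, v2)

def T_tilde(v1, v2):
    # the induced map V^2 -> V:  sum_i T(e^i, v1, v2) e_i
    return np.einsum('ijk,j,k->i', C, v1, v2)

omega = rng.standard_normal(2)
v1, v2 = rng.standard_normal((2, 2))

# Round trip: applying omega to T_tilde(v1, v2) recovers T(omega, v1, v2)
assert np.isclose(omega @ T_tilde(v1, v2), T(omega, v1, v2))
```

The round trip shows the two descriptions carry the same information once a basis is fixed.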

fresh_42 said:
• A vector isn't a function. If we treat it as such, then we must do something, namely setting ##(V^*)^* =V## or something else, e.g. evaluating a differential form at a certain point. These cannot be viewed as automatisms. We actively turn a vector into something else, mostly into ##v \leftrightsquigarrow (w \longmapsto \langle w,v \rangle).## And if there is no inner product?
Again, probably just a silly question, but why do we need to consider vectors as functions?

The last question (which isn't actually a question) is beyond my understanding, so maybe we should concentrate on finite dimensions?

Sorry if my questions are silly or if I'm conflating completely different concepts.

fresh_42 said:
A tensor is a multilinear function,
Fresh, you seem to be thinking of a different multilinear function from the one the rest of us are.

I am using the definition of an ##(m, n)## tensor over vector space ##V## and field ##F## as a multilinear function from ##V^m\times (V^*)^n## to ##F##.

Given the quote above, you appear to be viewing it as a multilinear function, but a different one from mine. You appear to consider the domain of the function to be ##V^m##, but you have not explicitly stated the range. I imagine you see the range as something like what we get if we successively reduce the matrix representation of ##T## by each of the ##m## input vectors. Hence the need for care in an infinite-dimensional space: the range of such a function might be ##(V^{**})^n##, which will not in all cases be the same as ##V^n##. I get the sense that it is those complexities that are concerning you.

What I don't get is why you feel the need to bother with those complexities at all, since they only arise from treating the tensor as a function with domain ##V^m##, rather than as a function with domain ##V^m\times (V^*)^n##.

I think you created this problem by defining, in your first post, ##T_i## as a product of vectors and covectors, i.e.
fresh_42 said:
We have
$$T_i=\underbrace{v_1\otimes v_2\otimes \ldots\otimes v_{r_i}}_{r_i\text{ many vectors}}\otimes \underbrace{v^*_1\otimes v^*_2\otimes \ldots\otimes v^*_{s_i}}_{s_i\text{ many covectors}}\quad (i=1,2)$$
rather than as a function that takes vectors and covectors as inputs and delivers a scalar, which is what the OP did.

If you define it your way, then you can't assume that the tensor eats covectors; but if you use the OP's (and my, and martinbn's) definition, you can.

Since all the objections seem to arise from the above formula, they fall away when we reject it.

Both definitions can work and can be shown to be equivalent, up to isomorphism. They appear as the 2nd and 3rd definitions, respectively, in the wiki article on tensors. But the OP is using the definition as a multilinear map (with scalar range), so it just introduces unnecessary confusion to use a different definition.

fresh_42 said:
Because the definition says ##(r,s)##-tensor. This means ##r## vectors and ##s## covectors.

I am complaining that you basically claim it is a ##(0,r+s)##-tensor, which it is not.
I don't think anyone was saying that a vector is the same as a covector. I think what was being claimed was that an ##(r,s)## tensor can be thought of as a function that takes ##s## vectors and ##r## covectors and returns a scalar.

So in this interpretation, a ##(1,0)## tensor is not literally a vector; it is an element of ##V^{**}##, not an element of ##V##.

We may end up having to appeal to Bill Clinton's (in)famous phrase "It depends on what 'is' is." You can look at, e.g., the real numbers from many perspectives, as they carry many different types of structure: 1-dimensional Euclidean space, a field, a vector space over itself, etc., and none of these somehow prevails over the others.

fresh_42 said:
I think you should write a paper.

I think there is no need for that, since there are already a lot of textbooks on linear algebra that state what you say is "wrong". E.g. Introduction to Algebra, volume 2, by Alexei Kostrikin. And I would say it's basic knowledge; I learned it during my first year at university! I thought it's the standard way to define tensors. It's weird that a mathematician does not know that.

The standard definition of the tensor product of two vector spaces (perhaps infinite-dimensional) is as follows. Let ##E,F## be vector spaces (say over ##\mathbb{R}##) and let ##B(E,F)## be the space of bilinear functions ##f:E\times F\to \mathbb{R}##.

Define a mapping ##u_{xy}:B(E,F)\to \mathbb{R}## as follows:
##u_{xy}(f)=f(x,y)##, so that ##u_{xy}\in (B(E,F))^*.## We also get a bilinear mapping
$$\chi:E\times F\to (B(E,F))^*,\quad \chi(x,y)=u_{xy}.$$
By definition the tensor product ##E\otimes F## is the linear span of $$\chi(E\times F);\quad x\otimes y:=u_{xy}.$$

The main feature is as follows. Any bilinear function ##A:E\times F\to W## (##W## is some other vector space) can be written as ##A=\tilde A\circ\chi##, where ##\tilde A:E\otimes F\to W## is a linear mapping.
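This universal property is easy to illustrate in coordinates, where ##x\otimes y## becomes the outer product and ##\tilde A## a full contraction. A minimal sketch (the matrix ##M## and the vectors are arbitrary choices of mine, with finite-dimensional stand-ins for ##E## and ##F##):

```python
import numpy as np

# E = R^2, F = R^3. A bilinear map A(x, y) = x^T M y factors through x ⊗ y:
# the linear map A~ just contracts M against the product tensor.
M = np.array([[1.0, 0.0, 2.0],
              [0.5, -1.0, 0.0]])

def A(x, y):
    # bilinear on E x F
    return x @ M @ y

def chi(x, y):
    # the canonical bilinear map: x ⊗ y as a component array
    return np.outer(x, y)

def A_tilde(t):
    # linear on E ⊗ F: full contraction of M against the tensor t
    return float(np.sum(M * t))

x = np.array([2.0, 3.0])
y = np.array([1.0, 0.0, -1.0])

# A = A~ ∘ chi on this pair:
assert np.isclose(A_tilde(chi(x, y)), A(x, y))
```

The point of the factorization is exactly what is stated above: the bilinear data of ##A## is repackaged as a single linear map on ##E\otimes F##.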

wrobel said:
The standard definition of the tensor product of two vector spaces (perhaps infinite-dimensional) is as follows. Let ##E,F## be vector spaces (say over ##\mathbb{R}##) and let ##B(E,F)## be the space of bilinear functions ##f:E\times F\to \mathbb{R}##.

Define a mapping ##u_{xy}:B(E,F)\to \mathbb{R}## as follows:
##u_{xy}(f)=f(x,y)##, so that ##u_{xy}\in (B(E,F))^*.## We also get a bilinear mapping
$$\chi:E\times F\to (B(E,F))^*,\quad \chi(x,y)=u_{xy}.$$
By definition the tensor product ##E\otimes F## is the linear span of $$\chi(E\times F);\quad x\otimes y:=u_{xy}.$$

The main feature is as follows. Any bilinear function ##A:E\times F\to W## (##W## is some other vector space) can be written as ##A=\tilde A\circ\chi##, where ##\tilde A:E\otimes F\to W## is a linear mapping.
Thanks

In a general sense, the tensor product of two vector spaces ##V, W## over the same field is a third vector space ##V \otimes W## whose dimension is the product of those of ##V## and ##W##, and such that every bilinear map ##V \times W \rightarrow Z## becomes a linear map ##V\otimes W \rightarrow Z##, for ##Z## any vector space.
The idea is to transform multilinear maps into linear ones, as the latter are simpler and easier to deal with.

