Tensor rank: One number or two?

  • #1
George Keeling
TL;DR Summary
Is it necessary to have two numbers to specify the rank of a tensor?
When I started learning about tensors the tensor rank was drilled into me: "A tensor of rank ##\left(m,n\right)## has ##m## up indices and ##n## down indices." So is a rank ##(1,1)## tensor written ##A_\nu^\mu##, or should that be ##A^\mu{}_\nu##, or ##A_\nu{}^\mu##? Tensor coefficients change when the indices move up or down, but surely the tensor itself stays the same. Can I forget about the ##\left(m,n\right)## business and just use the rank of a tensor, ##m+n## = the total number of indices?
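One way to see that the placement of indices matters, not just their total number, is to lower different indices of the same tensor numerically. Here is a minimal sketch (the metric and components are made up for illustration, not taken from the thread):

```python
import numpy as np

# An assumed symmetric 2-D metric g_{mu nu} and a non-symmetric
# (2,0) tensor B^{mu nu}, purely illustrative.
g = np.array([[1.0, 0.5],
              [0.5, 2.0]])
B = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# Lower the FIRST index:  B_mu{}^nu = g_{mu a} B^{a nu}
B_first = np.einsum('ma,an->mn', g, B)

# Lower the SECOND index: B^mu{}_nu = g_{nu a} B^{mu a}
B_second = np.einsum('ma,na->mn', B, g)

# Because B is not symmetric, the two results differ: the horizontal
# order of the indices carries real information.
```

Both results are arrays of four numbers, so "rank 2" alone cannot distinguish them; only the ##(m,n)## bookkeeping and the index order tell you which is which.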
 
  • #2
You shouldn't ignore it, because the vertical positioning tells you how many vector and covector arguments the tensor takes.
George Keeling said:
Tensor coefficients change when the indices move up or down but surely the tensor itself stays the same.
Hmm I don't think that's right. When you 'raise and lower indices' with the metric tensor, you do end up with a completely different tensor which lives in a different tensor space. For example, take a vector ##A = A^k e_k##, and now take the tensor product with the metric tensor and contract over the last two slots:$$\tilde{A} = \mathscr{C}(2,3) [g \otimes A] = \mathscr{C}(2,3) [ g_{ij} A^k e^i \otimes e^j \otimes e_k] = g_{ij} A^k e^i \delta^j_k = g_{ij} A^j e^i \equiv \tilde{A}_i e^i$$We see that ##\tilde{A}_i = g_{ij} A^j##, however note that ##A \in V## whilst ##\tilde{A} \in V^*##. So ##A## and ##\tilde{A}## are completely different objects, although they are put into one-to-one correspondence via the metric tensor ##g## [i.e. we can define an isomorphism, as above]. In that way, we can abuse notation a little bit and simply write ##A_i = g_{ij} A^j##. And you can extend the exact same reasoning to a general ##(m,n)## tensor, which is put into correspondence with ##2^{m+n}-1## other tensors via the metric.

(N.B. This is not the same as performing a change of basis of a given tensor space, which would indeed give a different representation of the same tensor.)
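The contraction in this post is easy to check in components. A minimal numerical sketch, with an assumed 2-dimensional metric and illustrative vector components:

```python
import numpy as np

# Assumed metric g_{ij} (symmetric, invertible) and vector A^k.
g = np.array([[1.0, 0.5],
              [0.5, 2.0]])
A = np.array([3.0, -1.0])

# tilde{A}_i = g_{ij} A^j  -- the covector in V* paired with A
A_tilde = np.einsum('ij,j->i', g, A)

# The correspondence is invertible: raising with the inverse metric
# g^{ij} recovers the original components, confirming the isomorphism.
g_inv = np.linalg.inv(g)
A_back = np.einsum('ij,j->i', g_inv, A_tilde)
```

The round trip ##A^i \mapsto \tilde{A}_i \mapsto A^i## returning the original components is exactly the one-to-one correspondence described above.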
 
  • #3
George Keeling said:
Can I forget about the ##\left(m,n\right)## business and just need the rank of a tensor which is ##m+n## = the total number of indices?
That depends on what you want to do. A tensor lives in a product of vector spaces and their duals: ##V\otimes\ldots\otimes V\otimes V^*\otimes\ldots\otimes V^*##. The dual factors are linear forms, i.e. linear functions: they can be fed vectors and spit out a scalar. The rank ##(m,n)## says how many copies of ##V## and of ##V^*## there are. From a purely mathematical point of view they are all vector spaces, and even of the same dimension, so if you only regard them as vector spaces there is little reason to distinguish them. From a physical point of view, though, dropping the distinction is completely wrong: the vectors set up the space, while the covectors (= linear forms) combine into a function which eats vectors and produces a scalar, a number. Cf.

https://www.physicsforums.com/insights/what-is-a-tensor/
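The "function which eats vectors" picture can be made concrete. A minimal sketch with made-up components:

```python
import numpy as np

# A covector (linear form) omega in V*, given by its components omega_i.
omega = np.array([1.0, -2.0, 0.5])

def eat(omega, v):
    """omega(v) = omega_i v^i -- a linear map from V to the scalars."""
    return omega @ v

v = np.array([2.0, 1.0, 4.0])
# eat(omega, v) = 1*2 + (-2)*1 + 0.5*4 = 2.0
```

A vector and a covector both have three components here, but only the covector comes with this natural "evaluate on a vector, get a number" role.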
 
  • #4
etotheipi said:
You shouldn't ignore it, because the vertical positioning tells you how many vector and covector arguments the tensor takes.
Surely then we also need to know the order of the vector and covector arguments? Wouldn't fresh_42's ##V\otimes V\otimes V^*\otimes V^*## be different if it were ##V\otimes V^*\otimes V\otimes V^*##?
And this might be amusing:
Another thing that has been drilled into me is that in the equation for the line element $$
{ds}^2=g_{ij}{dx}^idx^j
$$##{dx}^i,dx^j## are basis dual vectors or covectors (and that ##{dx}^idx^j\neq dx^j{dx}^i##). Apologies for writing ##dx## rather than ##\mathbf{dx}##. I suppose it must also be true that $$
{ds}^2=g^{kl}{dx}_k{dx}_l
$$From the post ##g_{ij}A^je^i\equiv{\widetilde{A}}_ie^i## it looks like I could lower the ##i## on the LHS to get $$
A^je_j={\widetilde{A}}_ie^i
$$Changing the ##e## 's to ##dx##'s that gives me$$
{\widetilde{A}}_i=\frac{{dx}_j}{{dx}^i}A^j
$$which looks like the tensor transformation law for a change of basis. No wonder I think, so wrongly, that covector / vector bases are just like different coordinate bases.:eek:
 
  • #5
George Keeling said:
surely then we need to know the order of the vector and covector arguments.
Yes. This seems to be conventional - for example in the case of the Riemann tensor ##R^a{}_{bcd}## you just need to know that the ##b## index is associated with the vector you are planning to transport around a loop defined by infinitesimal vectors with indices ##c## and ##d##.
George Keeling said:
that ##{dx}^idx^j\neq dx^j{dx}^i##
I don't think this is quite right. Using distinct vectors, ##U^iV^j=V^jU^i## - i.e., the order you write the tensors doesn't matter (unlike matrices). However, in general ##U^iV^j\neq U^jV^i##, because the ##i,j##th component of one tensor is the ##j,i##th component of the other. But in your example, ##U=V=dx##, so ##dx^idx^j=dx^jdx^i##. And also if you contract ##U^iV^j## or ##U^jV^i## with a rank-2 tensor the result may be the same - it only matters how you match up the indices. So if ##T_{ij}## is an arbitrary tensor, ##T_{ij}U^iV^j=T_{ji}U^jV^i\neq T_{ij}U^jV^i##.
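These identities are easy to verify in components. A short sketch with made-up numbers (the tensor ##T_{ij}## is deliberately non-symmetric):

```python
import numpy as np

T = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0],
              [4.0, 0.0, 1.0]])   # arbitrary non-symmetric T_{ij}
U = np.array([1.0, 2.0, 3.0])     # U^i
V = np.array([0.0, 1.0, 1.0])     # V^j

s1 = np.einsum('ij,i,j->', T, U, V)   # T_{ij} U^i V^j
s2 = np.einsum('ji,j,i->', T, U, V)   # T_{ji} U^j V^i -- relabelled dummies
s3 = np.einsum('ij,j,i->', T, U, V)   # T_{ij} U^j V^i -- indices swapped

# s1 == s2 (renaming dummy indices changes nothing),
# but s1 != s3 in general, because the pairing of slots has changed.
```

So the order in which the factors are *written* is irrelevant; what matters is which index of ##T## each vector is contracted against.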
George Keeling said:
I suppose it must also be true that $$
{ds}^2=g^{kl}{dx}_k{dx}_l
$$From the post ##g_{ij}A^je^i\equiv{\widetilde{A}}_ie^i## it looks like I could lower the ##i## on the LHS to get $$
A^je_j={\widetilde{A}}_ie^i$$
Yes.
George Keeling said:
Changing the ##e## 's to ##dx##'s that gives me$$
{\widetilde{A}}_i=\frac{{dx}_j}{{dx}^i}A^j$$
Remember that these are sums. Writing it out explicitly, ##A^je_j={\widetilde{A}}_ie^i## means ##\sum_j A^j e_j=\sum_i{\widetilde{A}}_ie^i##, so you cannot divide both sides by ##e^i## as you did.

Getting back to your original question, in general relativity (which I think is your main interest here) you always have a metric available on any manifold of interest. That means that you can always raise or lower an index, so there isn't really any extra information in a vector that isn't encoded in its dual. So which you use is primarily a matter of computational convenience. You need to keep track of the raised and lowered indices and their orders so that you know that the calculation is legal, but if you have a raised index and it would be more convenient to work with a lowered index, you just lower it. As I understand it, that isn't generally the case in differential geometry (you can have manifolds without metrics), but it works for differential geometry as applied to GR.
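The "just lower it" step, in GR terms, looks like this. A minimal sketch using the Minkowski metric with signature ##(-,+,+,+)## and illustrative four-vector components:

```python
import numpy as np

# Minkowski metric eta_{mu nu}, signature (-,+,+,+)
eta = np.diag([-1.0, 1.0, 1.0, 1.0])
p = np.array([2.0, 1.0, 0.0, 0.5])   # p^mu, illustrative components

p_lower = eta @ p                    # p_mu = eta_{mu nu} p^nu
p_back = np.linalg.inv(eta) @ p_lower  # raise again: recovers p^mu

invariant = p_lower @ p              # p_mu p^mu, basis-independent scalar
```

Lowering flips the sign of the time component here, the round trip loses nothing, and the contraction ##p_\mu p^\mu## is the same whichever version you work with: exactly the sense in which the dual carries no extra information once a metric is available.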

Edit: Had a bit of a LaTeX nightmare with this one. I think I've fixed everything, but if I'm misquoting you somewhere it's because I didn't do the fixing right.
 
  • #6
@Ibix, from the equation ##g_{ij} A^j e^i = \tilde{A}_i e^i##, does it make sense to lower the index of the ##e^i##? I'm uncertain whether that makes sense geometrically. For one thing, ##A^j e_j \in V## whilst ##\tilde{A}_i e^i \in V^*##, so these last two things cannot be equal.
 
  • #7
etotheipi said:
@Ibix, from the equation ##g_{ij} A^j e^i = \tilde{A}_i e^i##, does it make sense to lower the index of the ##e^i##? I'm uncertain whether that makes sense geometrically. For one thing, ##A^j e_j \in V## whilst ##\tilde{A}_i e^i \in V^*##, so these last two things cannot be equal.
Good point - I was thinking of ##e^i## as vector components, but if it's actually meant to be the ##i##th basis then there's another reason you can't divide by it.
 
  • #8
Ibix said:
Good point - I was thinking of ##e^i## as vector components, but if it's actually meant to be the ##i##th basis then there's another reason you can't divide by it.

Ah yeah, sorry, I should really have typeset it in bold or included a little hat or something because now that I look at it, it does look a bit like a vector component 🤦‍♂️
 
  • #9
You explicitly stated it by noting that ##A=A^ke_k##. But I read your reply this morning and George's this afternoon and I forgot the context. o:) I should read more carefully.
 
  • #10
Ibix said:
you can have manifolds without metrics
eek! I suspected that there might be such things but I had never seen it written in black and white. It explains a lot that has seemed mysterious. My zone of ignorance has expanded. o:) Egg on my face too for dividing by ##e^i##.
Thanks very much all for your help.
 

