Basic question on tensor analysis

  • #1
Hello. Could we approximate a tensor of rank (p,q) with matrices of its elements? I am asking about the general case of a tensor, not only special cases. For example, a (2,0) tensor with indices ##i, j## is a matrix with ##i\times j## elements. A (3,0) tensor with indices ##i, j, k## is, I think, ##k## matrices with ##i\times j## elements each. So is a tensor of rank (p,q) with upper indices ##i_1, i_2, \dots, i_p## and lower indices ##j_1, j_2, \dots, j_q## a set of ##(i_1\times i_2\times\cdots\times i_p)\times(j_1\times j_2\times\cdots\times j_{q-2})## matrices, each with ##j_{q-1}\times j_q## elements? If I am incorrect, what did I do wrong? Thank you.
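To make the counting concrete, here is a rough NumPy sketch of what I have in mind (just component bookkeeping; the dimension ##n=3## and the (2,1) example are arbitrary choices):

```python
import numpy as np

n = 3                                # assumed: every index runs over n values

# A (3,0) tensor T^{ijk}: its components form an n x n x n array.
T = np.random.rand(n, n, n)

# Viewed as k = n matrices, each n x n: fix the first index and take the slice.
matrices = [T[k] for k in range(n)]
print(len(matrices), matrices[0].shape)   # 3 (3, 3)

# General case: the components of a (p,q) tensor carry p + q indices, so they
# can be grouped into n^(p+q-2) matrices of size n x n.
p, q = 2, 1                               # arbitrary example rank
S = np.random.rand(*([n] * (p + q)))      # component array of a (2,1) tensor
stack = S.reshape(-1, n, n)               # stack of n x n matrices
print(stack.shape)                        # (3, 3, 3): 3 matrices, each 3 x 3
```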
 
  • Like
Likes Delta2

Answers and Replies

  • #2
What do you mean by approximating a tensor?

A tensor of rank 0 is a scalar.

A tensor of rank 1 is a vector that could be viewed as an ordered set of scalars.

A tensor of rank 2 is a matrix that could be viewed as an ordered set of column vectors.

A tensor of rank 3 is a tensor (no particular name) that could be viewed as an ordered set of matrices.

However, thinking this way diminishes the full import of tensors. Tensors have subscript and superscript notation that represent covariant and contravariant attributes that are useful in differential geometry among other things.

https://en.wikipedia.org/wiki/Tensor
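To see the nesting concretely, here is a rough NumPy illustration (the shapes are arbitrary choices, and this only shows the component arrays, not the transformation behaviour that makes something a tensor):

```python
import numpy as np

scalar = np.float64(2.5)                              # rank 0: a single number
vector = np.array([1.0, 2.0, 3.0])                    # rank 1: ordered set of scalars
matrix = np.stack([vector, 2 * vector, 3 * vector],   # rank 2: ordered set of
                  axis=1)                             #         column vectors
rank3  = np.stack([matrix, matrix.T])                 # rank 3: ordered set of matrices

for t in (scalar, vector, matrix, rank3):
    print(np.ndim(t), np.shape(t))
# 0 ()
# 1 (3,)
# 2 (3, 3)
# 3 (2, 3, 3)
```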





 
  • Like
Likes Delta2 and FactChecker
  • #4
etotheipi
A tensor of rank 1 is a vector that could be viewed as an ordered set of scalars.
Tiny quibble... a scalar is, like you point out, a rank-0 tensor, and is by definition coordinate independent. So the components of a vector w.r.t. some basis are not scalars, instead they are just numbers [which of course transform under change of coordinates].
 
  • Like
  • Haha
Likes Stephen Tashi and jedishrfu
  • #5
It is not necessary to approximate a tensor as I describe, but it may be a goal one has. Other approaches are needed too: as I read about differential geometry and the use of tensors, they are studied differently from the way I describe. What I mean by approximation is that the tensor may be represented by as many matrices as I said before. Having the components of those matrices would help answer my question in the general case. I think the components and matrices of a tensor of rank (p,q) are so many that we would need much more paper or space to write them all down. Perhaps my question is wrong.
 
  • #6
Tiny quibble... a scalar is, like you point out, a rank-0 tensor, and is by definition coordinate independent. So the components of a vector w.r.t. some basis are not scalars, instead they are just numbers [which of course transform under change of coordinates].
Can they not be functions? Can the vector with respect to a basis not be of the form ##f=(f_1,\dots,f_n)##, where the ##f_i##, ##\forall \, i=1,\dots,n##, are functions of ##m## variables ##x_j##, ##\forall \, j=1,\dots,m##, with ##x_j\in\mathbb{R}##? Do they not obey the transformation law? Is that why?
 
  • #7
Can they not be functions? Can the vector with respect to a basis not be of the form ##f=(f_1,\dots,f_n)##, where the ##f_i##, ##\forall \, i=1,\dots,n##, are functions of ##m## variables ##x_j##, ##\forall \, j=1,\dots,m##, with ##x_j\in\mathbb{R}##? Do they not obey the transformation law? Is that why?
They can and they are. Take a derivative as an example: ##f(x)=3x^2## and ##\left. \dfrac{d}{dx}\right|_{x=5}f(x)=30##. This is a number, a scalar, and a linear function, or even a covariant derivative. The latter two are the usual interpretations in differential geometry and physics. While the number ##30## is the slope of the tangent at ##x=5## at school, it becomes the linear function ##30=D_5(f)\, : \,v\longrightarrow 30\cdot v## in differential geometry. This might be a bit confusing, but it is the one-dimensional version of what we call e.g. a curvature tensor. So whether the value ##30## is considered a slope, a number, a scalar or a linear function depends on whom you ask, that is to say: on the context.
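A small numerical sketch of these two viewpoints (the helper names derivative_at and D5 below are only illustrative):

```python
# The derivative of f(x) = 3x^2 at x = 5, seen either as the number 30 or as
# the linear map v -> 30*v on tangent vectors.

def f(x):
    return 3 * x**2

def derivative_at(func, a, h=1e-6):
    """Central-difference approximation of the derivative of func at a."""
    return (func(a + h) - func(a - h)) / (2 * h)

slope = derivative_at(f, 5.0)      # the "number/scalar" point of view: ~30.0
D5 = lambda v: slope * v           # the "linear map" point of view: D_5(f)

print(round(slope, 6))             # ~30.0
print(D5(1.0), D5(2.0))            # ~30.0 ~60.0, i.e. v -> 30*v
```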
 
  • Like
Likes jedishrfu
  • #8
Can they not be functions? Can the vector with respect to a basis not be of the form ##f=(f_1,\dots,f_n)##, where the ##f_i##, ##\forall \, i=1,\dots,n##, are functions of ##m## variables ##x_j##, ##\forall \, j=1,\dots,m##, with ##x_j\in\mathbb{R}##? Do they not obey the transformation law? Is that why?
If one has vector fields ##X_{i}## in some neighborhood that form a basis in each tangent space, then any vector field in this neighborhood can be expressed as a linear combination of these basis fields. The coefficients of this linear combination will be functions on the neighborhood. So if ##V=∑_{i}a_{i}X_{i}##, then the ##a_{i}## are functions (scalars). If one changes basis, ##V## will be written using different functions. The same thing holds for any tensor.

If one has a vector field ##V## and a 1-form ##ω##, then the value ##ω(V)## of ##ω## on ##V## is a function (scalar). Given the basis ##X_{i}##, one has an n-tuple of 1-forms ##ω_{i}## that give the ##i##'th coefficient of any vector field in its linear expansion with respect to the basis fields. So for an arbitrary vector field ##V##, ##V=∑_{i}ω_{i}(V)X_{i}##. The ##ω_{i}(V)## are functions for each vector field ##V##.
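A minimal component-level sketch of this at a single point ##p## (the basis matrix B and the vector V below are arbitrary illustrative choices):

```python
import numpy as np

# At a single point p: the basis vector fields X_1, X_2 evaluated at p are the
# columns of B, and a vector V at p is expanded as V = a_1 X_1 + a_2 X_2.
# The dual 1-forms omega_i at p are the rows of B^{-1}, so omega_i(V) = a_i.

B = np.array([[1.0, 1.0],          # columns: X_1(p), X_2(p)
              [0.0, 2.0]])
V = np.array([3.0, 4.0])           # some vector at p

a = np.linalg.solve(B, V)          # coefficients a_i with B @ a = V
omega = np.linalg.inv(B)           # rows: the dual 1-forms omega_i at p

print(a)                           # [1. 2.]
print(omega @ V)                   # the same numbers: omega_i(V) = a_i
```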
 
  • #9
What do you mean by approximating a tensor?

A tensor of rank 0 is a scalar.

A tensor of rank 1 is a vector that could be viewed as an ordered set of scalars.

A tensor of rank 2 is a matrix that could be viewed as an ordered set of column vectors.

A tensor of rank 3 is a tensor (no particular name) that could be viewed as an ordered set of matrices.

However, thinking this way diminishes the full import of tensors. Tensors have subscript and superscript notation that represent covariant and contravariant attributes that are useful in differential geometry among other things.

https://en.wikipedia.org/wiki/Tensor





"Tensor is a thing that transform like a tensor" is definitely my favorite definition
 
  • Like
Likes PhDeezNutz and jedishrfu
  • #10
Yeah, recursive definitions often leave one in the dust.
 
  • #11
Stephen Tashi
So the components of a vector w.r.t. some basis are not scalars, instead they are just numbers [which of course transform under change of coordinates].
Yes, but human endeavors permit contradictions, so an elementary physics text may speak of "resolving a vector into components" and draw the components as vectors. For example, the gravitational force on a mass resting on an inclined plane can be represented by a "component" perpendicular to the surface of the plane and a "component" parallel to the surface of the plane.
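A quick numerical version of that inclined-plane example (the mass, g and angle below are arbitrary choices):

```python
import numpy as np

# Gravity on a mass m resting on a plane inclined at angle theta, "resolved"
# into a component perpendicular and a component parallel to the surface.
m, g = 2.0, 9.81                   # kg, m/s^2 (arbitrary values)
theta = np.radians(30.0)           # incline angle

F = m * g                          # magnitude of the gravitational force
F_perp = F * np.cos(theta)         # perpendicular to the surface
F_par = F * np.sin(theta)          # parallel to the surface (down the slope)

print(round(F_perp, 2), round(F_par, 2))   # ~16.99 ~9.81
print(round(np.hypot(F_perp, F_par), 2))   # ~19.62, recovering |F| = m*g
```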
 
  • Like
Likes jedishrfu
  • #12
Tiny quibble... a scalar is, like you point out, a rank-0 tensor, and is by definition coordinate independent. So the components of a vector w.r.t. some basis are not scalars, instead they are just numbers [which of course transform under change of coordinates].
Actually, the components are scalars. When you transform the tensor to a new coordinate system, the new components are different scalars. Scalars are just functions defined on the manifold. The components of a vector field in a given coordinate system are functions.

For instance, the components of a vector field (a contravariant tensor of rank 1) in a given coordinate system form an n-tuple of functions on the coordinate domain. The most primitive examples are the coordinate functions ##x_{i}## themselves, which are the projections of the coordinate chart onto the standard axes in Euclidean space.
 
  • Like
Likes etotheipi
  • #13
etotheipi
I thought that ##\phi## is a scalar if, for any two coordinate systems ##C## and ##C'##, $$\phi'(x') = \phi(x)$$ that is, a scalar is invariant under a change of coordinate system. That definition is not satisfied by the components of a vector with respect to some basis, though.
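A small sketch of the distinction, using a Cartesian-to-polar change of coordinates (the particular field and point below are arbitrary):

```python
import numpy as np

# A scalar field evaluated at the same point has the same value in Cartesian
# and polar coordinates, phi'(x') = phi(x); the components of a vector at that
# point do not stay the same.

x, y = 1.0, 1.0                              # a point, Cartesian coordinates
r, t = np.hypot(x, y), np.arctan2(y, x)      # the same point, polar coordinates

phi = lambda x, y: x**2 + y**2               # scalar field in Cartesian form
phi_polar = lambda r, t: r**2                # the same field in polar form
print(phi(x, y), phi_polar(r, t))            # both ~2.0: invariant

v = np.array([1.0, 0.0])                     # Cartesian components of a vector
# contravariant rule: v'^i = (partial x'^i / partial x^j) v^j, with x' = (r, t)
J = np.array([[x / r,       y / r],
              [-y / r**2,   x / r**2]])
print(J @ v)                                 # ~[0.707, -0.5]: different numbers
```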
 
  • #14
I thought that ##\phi## is a scalar if, for any two coordinate systems ##C## and ##C'##, $$\phi'(x') = \phi(x)$$ that is, a scalar is invariant under a change of coordinate system. That definition is not satisfied by the components of a vector with respect to some basis, though.
The components in any given coordinate system are scalars. When one changes coordinates one gets a new set of scalars. If one considers all components in all coordinate systems together as a single object then one has the underlying tensor. But that is different.
 
  • Like
Likes etotheipi
  • #15
etotheipi
I think I see what you mean, thanks.
 
  • #16
I think I see what you mean, thanks. I delete!
No need to delete. Your comment leads to clarification of ideas. The index definition of tensors is not easy to master.
 
  • Like
Likes fresh_42
  • #17
wrobel
Hello. Could we approximate a tensor of rank (p,q) with matrices of its elements? I am asking about the general case of a tensor, not only special cases. For example, a (2,0) tensor with indices ##i, j## is a matrix with ##i\times j## elements. A (3,0) tensor with indices ##i, j, k## is, I think, ##k## matrices...
Yes, you can consider a tensor by means of matrices, but that is completely useless by itself for understanding what a tensor is.
 
  • Like
Likes etotheipi and LCSphysicist
