Math Newb Wants to know what a Tensor is

  • #51
Maybe this text can clarify the meanings of covariance and contravariance for less mathematically inclined people:

http://www.mathpages.com/home/kmath398.htm [Broken]
 
  • #52
jcsd
If a quantity is covariant, it means it transforms like the basis vectors; contravariant means it transforms oppositely to the basis vectors.
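For example (my own illustration, not in the original post): if every basis vector is doubled, [itex]\vec{e}\,'_i = 2\vec{e}_i[/itex], then the components of a fixed covector double with the basis ([itex]w'_i = 2w_i[/itex], covariant), while the components of a fixed vector halve ([itex]v'^i = \tfrac{1}{2}v^i[/itex], contravariant), so that the combination [itex]v^i \vec{e}_i[/itex] stays the same.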
 
  • #53
mathwonk
look, if f:X-->Y is a differentiable map, and v is a tangent vector to X, and df is the derivative of f, then df(v) is a tangent vector to Y. this means v transforms "covariantly" in category language, i.e. in the same direction as the map f, i.e. from X to Y.

If on the other hand, q is a covector on Y, i.e. a function that sends a tangent vector on Y to a number, then df*(q) is a covector on X, as follows: if v is a tangent vector on X, then df*(q)(v) = q(df(v)). thus df* takes covectors on Y to covectors on X, i.e. covectors go in the opposite direction from the map f.

that's all there is to it.

of course the confusion is that in differential geometry, which i suppose means also in physics, the terminology is backwards.
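in coordinates, the two rules above come down to multiplying by the jacobian or by its transpose. a minimal numpy sketch (my own toy map, not from the post):

[code]
import numpy as np

# a toy differentiable map f: R^2 -> R^2
def f(p):
    x, y = p
    return np.array([x**2, x*y])

def df(p):
    """jacobian of f at p -- the derivative map df"""
    x, y = p
    return np.array([[2*x, 0.0],
                     [y,   x ]])

p = np.array([1.0, 2.0])
v = np.array([3.0, -1.0])   # tangent vector at p, on X
q = np.array([0.5, 2.0])    # covector at f(p), on Y

J = df(p)
push_v = J @ v              # df(v): goes in the same direction as f (X to Y)
pull_q = J.T @ q            # df*(q): goes in the opposite direction (Y to X)

# the defining identity: df*(q)(v) = q(df(v))
assert np.isclose(pull_q @ v, q @ push_v)
[/code]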
 
  • #54
mathwonk said:
If on the other hand, q is a covector on Y, i.e. a function that sends a tangent vector on Y to a number, then df*(q) is a covector on X, as follows: if v is a tangent vector on X, then df*(q)(v) = q(df(v)). thus df* takes covectors on Y to covectors on X, i.e. covectors go in the opposite direction from the map f.
mathwonk,

i couldn't agree with you more. this well-defined behavior of differential forms with respect to pullback is one of the most intriguing and useful properties of forms. a more subtle point is that these are still well defined even if the inverse mapping is not! tensors in general do not enjoy this property.

i have nothing against the physicists' way of doing math, but personally i find the terminology and the way they go about doing tensor analysis extremely confusing. when i was first learning this stuff (and am still learning it) i started by reading books that taught tensors the classical way: "something is a tensor if it transforms in blah blah way under a coordinate change blah blah". i did this for some time until i realized that this is an extremely confusing way of going about it... i then picked up frankel's and bishop's books on the subject and everything made perfect sense.

i think that one of the main hurdles is that in classical tensor analysis, people dealt with the components of a tensor rather than with the tensor itself! this is still a major point of confusion in the subject. furthermore, why bother saying that a tensor is something that changes in a certain way with a coordinate change, when it is nearly always impractical to work in terms of coordinates anyway?
 
  • #55
gvk said:
First of all, covariant and contravariant vectors are not different vectors. They represent ONE VECTOR (an arrow :-) in two different coordinate systems (dual, or reciprocal, or skew, or... coordinates). The reciprocal system is equally satisfactory for representing vectors, but the 'contravariant' vector looks exactly the same as the 'covariant' one.
This is correct.

Incidentally, when we refer to a vector (or, more generally, a tensor) as being either contravariant or covariant we're abusing the language slightly, because those terms really just signify two different conventions for interpreting the components of the object with respect to a given coordinate system, whereas the essential attributes of a vector (or tensor) are independent of the particular coordinate system in which we choose to express it. In general, any given vector or tensor can be expressed in both contravariant and covariant form with respect to any given coordinate system.
from http://www.mathpages.com/rr/s5-02/5-02.htm
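In formulas (a standard statement of the mathpages picture, added here for clarity): given a basis [itex]\{\vec{e}_i\}[/itex], one and the same arrow [itex]\vec{v}[/itex] has contravariant components [itex]v^i[/itex] and covariant components [itex]v_i[/itex],

[tex]\vec{v} = v^i \vec{e}_i, \qquad v_i = \vec{v} \cdot \vec{e}_i = g_{ij} v^j, \qquad g_{ij} = \vec{e}_i \cdot \vec{e}_j.[/tex]

In an orthonormal basis [itex]g_{ij} = \delta_{ij}[/itex], so the two sets of components coincide, which is why the distinction is invisible in Cartesian coordinates.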
 
  • #56
jcsd
Peterdevis said:
No, it isn't; as has been pointed out before, so-called contravariant and covariant vectors belong to two different vector spaces (in fact a covariant vector is a linear function of a contravariant vector, and vice versa).

IMHO this confusion arises because people are used to dealing with Cartesian tensors, where the distinction isn't obvious.

All your link is really saying is that there exists a certain one-to-one correspondence between a vector space and its dual vector space that maps a vector to its dual vector; this one-to-one correspondence is called the metric and is a tensor of rank (0,2). Clearly, identifying a pair of dual vectors as a single vector is incorrect.
 
  • #57
jcsd said:
All your link is really saying is that there exists a certain one-to-one correspondence between a vector space and its dual vector space that maps a vector to its dual vector; this one-to-one correspondence is called the metric and is a tensor of rank (0,2)
No, it isn't! The problem is that most math people only see one-to-one correspondences and not what objects tensors really are. Tensors are primarily
objects which are independent of the coordinate system.
When you talk about covariant or contravariant vectors, you talk about the components of the tensor, expressed in a certain basis. Both components and basis are coordinate dependent!
When you take "tangent vectors" as the basis, you call the vector components covariant, because when you change the coordinate system the components transform by the covariant transformation rule (and the basis vectors by the contravariant transformation rule).
By not seeing that covariant and contravariant components describe the same object, you don't understand the power of the concept of tensors.
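For reference, the transformation rules being invoked here (standard formulas, not in the original post): under a coordinate change [itex]x \to x'[/itex],

[tex]\vec{e}\,'_i = \frac{\partial x^j}{\partial x'^i}\, \vec{e}_j, \qquad v'_i = \frac{\partial x^j}{\partial x'^i}\, v_j, \qquad v'^i = \frac{\partial x'^i}{\partial x^j}\, v^j,[/tex]

so the covariant components transform like the coordinate basis vectors, the contravariant components transform oppositely, and the combination [itex]v^i \vec{e}_i[/itex] is invariant.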
 
  • #58
jcsd
What is this object they are describing? As I said earlier, the only object I can see that they are describing is a pair of dual vectors, but from its very name you can see that we think of this object as two different objects!

These days we don't tend to define tensors by their transformation rules; we instead prefer to define them as multilinear functions of vectors and covectors (vector = contravariant vector, covector = covariant vector), so already in our definition of a tensor we've described vectors and covectors as different objects! Also, by seeing them as the same object you may well miss the fact that a covector is a linear function which maps a single vector to a number, and that a vector is a linear function that maps a covector to a number.
 
  • #59
jcsd
Note that when we raise or lower the index of a vector or a covector, what we are really doing is contracting it with the metric (to lower) or with the inverse metric (to raise).
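In components, a throwaway numpy sketch (my own made-up numbers, not from the post):

[code]
import numpy as np

g = np.array([[1.0, 0.5],
              [0.5, 2.0]])     # metric components g_ij (symmetric, invertible)
g_inv = np.linalg.inv(g)       # inverse metric g^ij

v = np.array([1.0, 3.0])       # contravariant components v^i
v_lower = g @ v                # v_i = g_ij v^j  (lowering the index)
v_back = g_inv @ v_lower       # v^i = g^ij v_j  (raising it again)

assert np.allclose(v_back, v)  # raising undoes lowering
[/code]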
 
  • #60
Wow, I read this entire thread and I still don't fully understand tensors. I don't understand a lot of the notation (is that the word? :-) or even the applications.

I'm 22 and come from more of a design background; I haven't really done any maths since I was about 15, but tonight I was hoping I'd be able to get the hang of "general relativity". (I was never bad at maths, btw, just impossibly slack.)

I have a solid grasp of 3d modelling/animation software, as far as x,y,z vertices animating along a timeline while being joined (via "vectors"?) to unlimited numbers of other vertices. Will this help me comprehend tensors?

Specific question: what do the little right-pointing arrows signify in maths?

Cheers
You guys are amazing.
Pete

PS John Napier (1550 - 1617) is a direct great^x grandad of mine! Lol, and I've never studied logarithms until tonight.
 
  • #61
pete66 said:
Specific question: what do the little right-pointing arrows signify in maths?
On top of a letter it signifies that this is a vector, but I assume that is not your question.

V-->W signifies a 'mapping' from V to W. This is another name for a transformation. It is usually a function that takes objects from a space V to a space W. In linear algebra, for example, the mapping is really a matrix multiplication. It brings a vector from a certain vector space (V) to another (W).
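For instance (a toy sketch of my own, not from the post):

[code]
import numpy as np

# a linear mapping A: V --> W, here from R^2 to R^3
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [2.0, 3.0]])

v = np.array([1.0, -1.0])   # a vector in V = R^2
w = A @ v                   # its image in W = R^3, i.e. [1., -1., -1.]
[/code]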
 
  • #62
Astronuc
"An nth-rank tensor in m-dimensional space is a mathematical object that has n indices and components and obeys certain transformation rules. Each index of a tensor ranges over the number of dimensions of space. However, the dimension of the space is largely irrelevant in most tensor equations (with the notable exception of the contracted Kronecker delta). Tensors are generalizations of scalars (that have no indices), vectors (that have exactly one index), and matrices (that have exactly two indices) to an arbitrary number of indices.

Tensors provide a natural and concise mathematical framework for formulating and solving problems in areas of physics such as elasticity [more generally constitutive models of materials], fluid mechanics, and general relativity."

from http://mathworld.wolfram.com/Tensor.html

all one ever wanted to know about tensors and then some. :biggrin:
 
  • #63
Astronuc said:
"An nth-rank tensor in m-dimensional space is a mathematical object that has n indices and components and obeys certain transformation rules. Each index of a tensor ranges over the number of dimensions of space. However, the dimension of the space is largely irrelevant in most tensor equations (with the notable exception of the contracted Kronecker delta). Tensors are generalizations of scalars (that have no indices), vectors (that have exactly one index), and matrices (that have exactly two indices) to an arbitrary number of indices.
i'm going to go out on a limb here and say that this definition is the whole problem with the way tensors are looked at...it doesn't really tell you anything meaningful. just saying that something follows certain transformation rules does not help you...

a tensor is a multilinear mapping, as follows:

[tex] T: \underbrace{V^* \times \cdots \times V^*}_{k} \times \underbrace{V \times \cdots \times V}_{l} \rightarrow \mathbb{R} [/tex]

with [itex]k[/itex] copies of [tex]V^*[/tex] and [itex]l[/itex] copies of [tex]V[/tex]; such a [itex]T[/itex] is a tensor of rank (k,l).

so it maps some combination of vectors and covectors into the reals, and it does so in a multilinear way. for a [tex](2,2)[/tex] tensor, for instance, writing [itex]\alpha = a_i dx^i[/itex], [itex]\beta = b_j dx^j[/itex], [itex]\vec{v} = v^k \partial_k[/itex] and [itex]\vec{w} = w^l \partial_l[/itex], multilinearity gives:

[tex]T(\alpha, \beta, \vec{v}, \vec{w}) = a_i b_j v^k w^l \, T(dx^i, dx^j, \partial_k, \partial_l)[/tex]

and let's look at the metric tensor:

[tex]G = g_{ij}\, dx^i \otimes dx^j [/tex]

let's feed it two vectors:

[tex] G(\vec{v}, \vec{w}) = g_{ij}\, dx^i(\vec{v})\, dx^j(\vec{w}) = g_{ij}\, dx^i(v^k \partial_k)\, dx^j(w^l \partial_l) = g_{ij} v^k w^l\, \delta^i{}_k\, \delta^j{}_l = g_{ij} v^i w^j[/tex]

most importantly, and this is the whole point that is lost with the wolfram definition: [tex]g_{ij}[/tex] is NOT the metric tensor, it is the array of components of the metric tensor. The metric tensor is the whole expression for [tex]G[/tex].

I can prove it to you: [tex]G[/tex] is invariant under a coordinate change, that is, it will always give you the same values no matter what coordinates you are in. [tex]g_{ij}[/tex], on the other hand, will be different in different coordinate systems. look at the tensor [tex]G[/tex] and you will see that this _must_ be the case.

this is no different from confusing the components of a vector with the vector itself! This is the problem with the so-called "classical definition": it defines a tensor in terms of what its components do, which is unnatural and quite frankly not very useful.
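here is a numerical version of that claim (my own sketch with made-up numbers, not part of the original post): change the basis by an arbitrary invertible matrix [itex]P[/itex]; the component array [itex]g_{ij}[/itex] changes, but the number [itex]G(\vec{v}, \vec{w})[/itex] does not:

[code]
import numpy as np

rng = np.random.default_rng(0)

g = np.array([[1.0, 0.3],
              [0.3, 2.0]])            # metric components g_ij in the old basis
v = np.array([1.0, 2.0])              # contravariant components v^i
w = np.array([-1.0, 0.5])

P = rng.random((2, 2)) + np.eye(2)    # a (generically invertible) change of basis
g_new = P.T @ g @ P                   # components transform: g'_ij = P^k_i g_kl P^l_j
v_new = np.linalg.solve(P, v)         # components transform oppositely: v' = P^(-1) v
w_new = np.linalg.solve(P, w)

# the value of the tensor G is coordinate independent
assert np.isclose(v @ g @ w, v_new @ g_new @ w_new)
[/code]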
 
  • #64
The question is:


[tex]G = g_{ij}\, dx^i \otimes dx^j = g^{ij}\, \partial_{i} \otimes \partial_{j}\ ?[/tex]
 
  • #65
Peterdevis said:
The question is:


[tex]G = g_{ij}\, dx^i \otimes dx^j = g^{ij}\, \partial_{i} \otimes \partial_{j}\ ?[/tex]
let's see (taking [itex]\alpha = v_i dx^i[/itex] and [itex]\beta = w_j dx^j[/itex] to be the covectors dual to [itex]\vec{v}[/itex] and [itex]\vec{w}[/itex]):

[tex] G^{-1}(\alpha, \beta) = g^{ij}\, \partial_i(\alpha)\, \partial_j(\beta) = g^{ij}\, \alpha(\partial_i)\, \beta(\partial_j) = g^{ij} v_i w_j = g^{ij} g_{ik} v^k w_j = \delta^j{}_k v^k w_j = v^j w_j = g_{ij} v^j w^i [/tex]

by symmetry of the inner product (by definition), however, [tex]g_{ij} = g_{ji} [/tex], so we can write:

[tex] g_{ij} v^j w^i = g_{ij} v^i w^j = G(\vec{v}, \vec{w}) [/tex]

so [tex] G^{-1}(\alpha, \beta) = G(\vec{v}, \vec{w}) [/tex]

which defines our relationship between vectors and covectors using this so-called metric tensor.
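a quick numerical sanity check of this identity (my own numbers; [itex]\alpha[/itex] and [itex]\beta[/itex] are taken to be the metric duals of [itex]\vec{v}[/itex] and [itex]\vec{w}[/itex]):

[code]
import numpy as np

g = np.array([[1.0, 0.4],
              [0.4, 3.0]])     # g_ij
g_inv = np.linalg.inv(g)       # g^ij

v = np.array([2.0, 1.0])
w = np.array([0.0, 5.0])
alpha = g @ v                  # alpha_i = g_ij v^j, the covector dual to v
beta = g @ w                   # likewise for w

# G^(-1)(alpha, beta) = G(v, w)
assert np.isclose(alpha @ g_inv @ beta, v @ g @ w)
[/code]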
 
  • #66
jcsd said:
What is this object they are describing? As I said earlier, the only object I can see that they are describing is a pair of dual vectors, but from its very name you can see that we think of this object as two different objects!
I don't think you have read the link http://www.mathpages.com/rr/s5-02/5-02.htm. When you look at the drawing, you see the object P and its covariant and contravariant expressions. So: one object, two expressions!
 
  • #67
Peterdevis said:
I don't think you have read the link http://www.mathpages.com/rr/s5-02/5-02.htm. When you look at the drawing, you see the object P and its covariant and contravariant expressions. So: one object, two expressions!
I believe that both you and jcsd are correct.

jcsd is saying that tensors are multilinear mappings that are coordinate independent, and that we can define everything in terms of linear functionals mapping to the reals. this is true. in fact we can even go so far as to say that [tex] \alpha [/tex] is the covector such that [tex] G^{-1}( \cdot , \alpha) = G( \cdot , \vec{v}) [/tex] is satisfied for some vector [tex] \vec{v} [/tex] (where the first arguments are fixed by bilinearity), and use the metric tensor as a linear transformation to the dual space.

Peterdevis is saying that the components and basis are very much coordinate dependent (by themselves), and that this is important to understand the transformation laws. this is also true.

Or am I mistaken here?

I will add one thing though: our basis vectors and the components of our covectors, even when used tensorially, are NOT the same, even though they follow the same transformation laws. they live in different spaces altogether. example: we can push forward the basis of our tangent vectors (as defined by our differential map, i.e. the Jacobian), but we cannot push forward the components of our covectors; we must use the pullback.
 
  • #68
jcsd
That's what I am trying to say: we're talking about vectors that live in different vector spaces, so we really shouldn't see them as the same object, even if there exists an important bijection between them (the metric tensor). In fact it's worth noting that the scalar product isn't always defined; in those cases are we then discussing incomplete objects?

Peterdevis isn't that far off the mark; he's just viewing tensors by the old definition. I have an old maths textbook that talks about the contravariant and covariant components of a vector, but I can't say I like that approach at all (especially as it talks about vector spaces and linear operators in a basis-independent way, but when it comes to tensors it suddenly starts to define them purely in terms of how the components change between bases, even though it notes that (some) linear operators are tensors).
 
  • #69
mathwonk
In my opinion, anybody who thinks that the properties of covariance or contravariance are not intrinsic properties of an object is quite innocent of what is going on in differential geometry and manifold theory.

Perhaps they are confused by the phenomenon of a Riemannian metric, i.e. a smoothly varying dot product on the tangent bundle, since if one has a dot product, one can artificially represent a vector as a covector, by dotting with it. But then it is a different object. I.e. the operation of dotting with a vector is not the same object as the vector itself.

As to convincing such people:

They said it couldn't be done.
They laughed when I said I would do it.
They said that it couldn't be done.
I rolled up my sleeves and went to it.

I struggled, I strove, I strained.
I fought at it day and night.
They said that it couldn't be done.
They were right.
 
  • #70
jcsd said:
I have an old maths textbook that talks about the contravariant and covariant components of a vector, but I can't say I...
jcsd,

i also have many older books that explain tensors this way, and have also been puzzled as to why they would want to approach the subject in this manner. as mathwonk pointed out, the real power of differential geometry will remain a mystery in that context.

it is also interesting to note that differential forms are well-defined with respect to pullback even if the inverse mapping isn't. tensors in general (and certainly vectors) do not enjoy this property; they are entirely dependent upon the mapping being bijective. so certain topological mappings (such as irreversible deformations) can still be modelled using forms. this is awesome, and something that i am trying to learn more about now.
 
  • #71
mathwonk
the point quetzalcoatl9 is making is extremely important, and leads to the powerful invariant called de rham cohomology. i.e. suppose we have a vector assigned to each point of X, i.e. a vector field on X, and a smooth map f from X to Y. if we want to assign a corresponding vector field on Y via the map f, we are out of luck. i.e. take a point y of Y. we want to push over some vector associated to a point of X, but what point do we choose? there is no way to say. we have the same problem with a covector field on X.

but if we have a "covector field" s on Y, the story is quite different. i.e. to each point of Y we have assigned a linear function on tangent vectors. now to assign such a covector field on X is easy. if x is a point of X and v is a tangent vector at x, then to evaluate our covector field on v, just apply the derivative (or differential) of f to v to push it over to a tangent vector to Y at f(x), and evaluate s there. the resulting number is defined to be the value of the pullback covector field f*(s) at v.

Then we can also show that the exterior derivative commutes with pullback. Now define the de rham cohomology group of degree 1 to be the space of differentials s such that ds = 0, modulo those which are of the form dg for some function g. then the remarks above show that any smooth map f:X-->Y induces a linear map from the de rham group of Y to that of X.


thus we have a functor on manifolds. the existence of this functor depends on the fact that covector fields are "contravariant" in category language, or "covariant" in differential geometry language; in either case the point is that they go in the opposite direction to the original map.

it should be obvious to anyone reading this discussion (the choir?) that covariance and contravariance are clearly intrinsic and absolutely essential properties of the objects being discussed.

the existence of this functor has powerful consequences.

for example, by stokes' theorem the de rham cohomology of a disk is zero, and since the form dθ is non-zero on a circle, the de rham cohomology of a circle is not zero. this implies the brouwer non-retraction theorem as follows:

if there were a smooth map from the disc onto its boundary, leaving the boundary fixed, it would induce the identity map on the de rham cohomology group of the circle, which would also factor through the zero group of the disc, an absurdity.

this then implies the brouwer fixed point theorem for the disc.

a similar argument with the solid angle form implies that the antipodal map of the sphere cannot be deformed smoothly into the identity map of the sphere, and this implies that there are no smooth vector fields on the sphere having no zero vectors anywhere.

the subject goes on and on and on... to say whole books have been written about it is a huge understatement; see for example bott & tu's book on differential forms, which recaptures large parts of algebraic topology via de rham cohomology.

when I was young and energetic, I taught the stuff on vector fields on spheres and the brouwer theorems in my sophomore several-variables calculus class at central washington state college.
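the fact that exterior derivative commutes with pullback can even be checked by machine. a small sympy sketch (my own toy map and 1-form, purely to illustrate the bookkeeping):

[code]
import sympy as sp

u, v, x, y = sp.symbols('u v x y')

# a toy smooth map f: X -> Y, (u,v) |-> (x,y)
fx, fy = u*v, u + v

# a toy 1-form  w = P dx + Q dy  on Y
P, Q = x**2*y, sp.sin(x) + y

# pullback f*(w) = A du + B dv, components via the chain rule
Pf = P.subs({x: fx, y: fy})
Qf = Q.subs({x: fx, y: fy})
A = Pf*sp.diff(fx, u) + Qf*sp.diff(fy, u)
B = Pf*sp.diff(fx, v) + Qf*sp.diff(fy, v)

# d(f*(w)): the coefficient of du ^ dv
d_pullback = sp.diff(B, u) - sp.diff(A, v)

# f*(dw): dw = (dQ/dx - dP/dy) dx ^ dy pulls back with the jacobian determinant
h = sp.diff(Q, x) - sp.diff(P, y)
jac = sp.diff(fx, u)*sp.diff(fy, v) - sp.diff(fx, v)*sp.diff(fy, u)
pullback_d = h.subs({x: fx, y: fy}) * jac

# exterior derivative commutes with pullback: d(f*(w)) = f*(dw)
assert sp.simplify(d_pullback - pullback_d) == 0
[/code]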
 
  • #72
jcsd
I don't really have much to add, except to say that I've found the discussions I've had with mathwonk, and his posts on this subject, extremely enlightening vis-à-vis tensors.
 
  • #73
mathwonk
jcsd, you made my day.

best wishes

mathwonk
 
  • #74
Tensors

chroot (Warren)....

That was a wonderful explanation of tensors. I appreciated the time you took to write it. I did not have the benefit of learning about tensors in school; it's just not something they taught when I was learning about physics. Since then I've seen the notation and even picked up a few tidbits of info, but that's about it. At some point I decided to just teach myself, something scientists are good at doing, most of the time, in their careers to help fill in the gaps left by formal education. This gap always made me feel like I was missing something. I'm hoping to set aside some time so that I can pursue this segue of learning, even though I do not use it in my work.

Your explanation is clear and concise. Makes me wonder how far ahead I would have been had I had more teachers like you in school!

Thanks again......many accolades to you!

:)




chroot said:
Even more generally, a tensor is a sort of mathematical machine that takes one or more vectors and produces another vector or number.

A tensor of rank (0,2), often just called rank 2, is a machine that chews up two vectors and spits out a number.

A tensor of rank (0,3) takes three vectors and produces a number.

A tensor of rank (1,2) takes two vectors and produces another vector.

Hopefully you see the pattern.

You actually already know what a (1,1) tensor is -- it's nothing more than a good ol' matrix. It accepts one vector and produces another vector.

If you're working in three dimensions, a (1,1) tensor can be represented by its nine components. Here's a simple (1,1) tensor.

[tex]
T = \left(
\begin{array}{ccc}
1 & 0 & 0\\
0 & 1 & 0\\
0 & 0 & 1
\end{array}
\right)
[/tex]

You already know what this guy does -- it takes a vector and gives you back the exact same vector. It's the identity matrix, of course. You would use it as such:

[tex]\vec v = T \vec v[/tex]

If the world were full of nothing but (1,1) tensors, it'd be pretty easy to remember what T means. However, there are many different kinds of tensors, so we need a notation that will help us remember what kind of tensor T is. We normally use something called "abstract index notation," which sounds more difficult than it is. Here's our (1,1) tensor, our identity matrix, laid out in all its regalia:

[tex]T^a_b[/tex]

The a and b are referred to as indices. The one on the bottom indicates the tensor takes one vector as "input." The one on the top indicates it produces one vector as "output."

Tensors don't have to accept just vectors or produce just vectors -- vectors are themselves just a type of tensor. Vectors are tensors of type (1,0). In full generality, tensors can accept other tensors, and produce new tensors. Here are some complicated tensors:

[tex]
R^a{}_{bcd}\ \ \ \ G_{ab}
[/tex]

The second one, [itex]G_{ab}[/itex], is a neat one to understand. You should already understand from its indices that it is a type (0,2) tensor, which means it accepts two vectors as input and produces a number as output. It's called the metric tensor, and represents an operation you already know very well -- the dot product of two vectors!

In normal Euclidean 3-space, [itex]G_{ab}[/itex] is just the identity matrix. You can easily demonstrate the following statement is true by doing the matrix multiplication by hand:

[tex]\vec u \cdot \vec v = G_{ab}\, u^a v^b[/tex]

The metric tensor is more complicated in different spaces. For example, in curved space, it's certainly not the identity matrix anymore -- which means the vector dot product is no longer what you're used to either when you're near a black hole. Tensors are used extensively in a subject called differential geometry, which deals with, among other topics, curved spaces. General relativity, Einstein's very successful theory which explains gravity as the curvature of space, is cast in the language of differential geometry.

So there you have it: tensors are the generalization of vectors and matrices and even scalars. (Scalars, by the way, are considered to be type (0,0) tensors.)

I should mention that not all mathematical objects with indices are tensors -- a tensor is a specific sort of object that has the transformation properties described by others in this thread. To be called a tensor, an object must transform like a tensor. Don't worry though, you're not going to run into such objects very often.

- Warren
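A quick numerical check of chroot's metric-tensor statement (my own numpy sketch, not part of the quoted post):

[code]
import numpy as np

G = np.eye(3)                    # metric components G_ab in Euclidean 3-space
u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, -1.0, 0.5])

# G_ab u^a v^b reproduces the ordinary dot product
assert np.isclose(u @ G @ v, np.dot(u, v))
[/code]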
 
  • #75
mathwonk
I agree that chroot's explanation is wonderful. it is the best and clearest one i have ever seen. I love it and learned something from it immediately.

The other shoe: one reason for that is that it describes only such simple tensors that one does not need the concept of tensors to understand them.

if you really want to understand tensors, i suggest you try to understand the curvature tensor. this concept was apparently the reason for the invention of tensors, and it was the first tensor, invented by riemann.

(another entry in the thread that can never die.)
 
