View Tensors as Multi-Variable Functions?

  • Thread starter: blashmet
  • Tags: Tensors
In summary, tensors can be viewed as multi-variable functions whose arguments are vectors or dual vectors. Every tensor is a multilinear functional, meaning it is linear in each argument separately. Tensors can be defined either as elements of tensor spaces or as multilinear functions at each point of a manifold.
  • #1
blashmet
Is it correct to view tensors as multi-variable functions? For example, it seems the permutation tensor is a function of three variables and the metric tensor is a function of two variables. Of course, these "functions" turn into constants once the indices i, j, and k are fixed, but it seems like those constants can be viewed as outputs of the function. So, am I viewing tensors correctly? Thanks!
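A minimal numerical sketch of that viewpoint (using the Euclidean metric and the 3-d Levi-Civita symbol as concrete examples; the arrays and vectors here are just illustrative):

[code]
import numpy as np

# Metric of Euclidean 3-space in Cartesian coordinates: g_ij = delta_ij.
g = np.eye(3)

# Permutation (Levi-Civita) tensor as a 3x3x3 array of components.
eps = np.zeros((3, 3, 3))
eps[0, 1, 2] = eps[1, 2, 0] = eps[2, 0, 1] = 1.0
eps[0, 2, 1] = eps[2, 1, 0] = eps[1, 0, 2] = -1.0

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])
w = np.array([0.0, 0.0, 1.0])

# The metric as a function of two vectors: g(u, v) = g_ij u^i v^j (a single number).
print(np.einsum('ij,i,j->', g, u, v))          # 32.0

# The permutation tensor as a function of three vectors:
# eps(u, v, w) = eps_ijk u^i v^j w^k, the scalar triple product u . (v x w).
print(np.einsum('ijk,i,j,k->', eps, u, v, w))  # -3.0

# Fixing the indices gives back a single component, e.g. eps[0, 1, 2] == 1,
# which is the value of the function on the corresponding basis vectors.
[/code]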
 
  • #2
Well, a tensor is a multilinear functional on copies of a vector space and its dual, and it can take multiple arguments, so I suppose in that sense it can be thought of as a multi-variable function, as long as one realizes that each of the "variables" is a vector or a dual vector.
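Concretely, for a tensor with two vector slots, multilinearity means linearity in each slot separately (a, b scalars; u, v, w, x vectors):

[itex]T(au+bv,\,w)\ =\ a\,T(u,w)+b\,T(v,w),\qquad T(u,\,aw+bx)\ =\ a\,T(u,w)+b\,T(u,x)[/itex]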
 
  • #3
Are all tensors multi-linear functionals? Do the arguments of all tensors have to be vectors? That is, are these necessary conditions of being a tensor? Thanks for the help!
 
  • #4
A k-tensor is a function of k vector variables (a multilinear one, that is). So yes, each of the arguments must be a vector.
 
  • #5
Can you have a tensor that isn't multi-linear? Thanks!
 
  • #6
I would just like to add that the arguments don't all have to be vectors (contravariant); they can be dual vectors (covariant) as well.

As to your last question, you can have a rank (0,1) tensor that takes a single vector as input. Similarly, you can have a rank (1,0) tensor that takes a single dual vector as input. However, all tensors must be linear in their arguments.
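In component form, for example, a (0,1) tensor (a covector ω) eats a vector, and a (1,0) tensor (a vector v) eats a covector, both by the same kind of contraction:

[itex]\omega(v)\ =\ \omega_i v^i,\qquad v(\omega)\ =\ v^i\omega_i[/itex]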
 
  • #7
hi blashmet!
blashmet said:
… Do the arguments of all tensors have to be vectors? That is, are these necessary conditions of being a tensor?

it's easiest to define a tensor in terms of its effect on "inputting" individual vectors,

but the "input" can be any tensor …

for example the metric ##g_{ij}## can have any tensor ##A^{ij}{}_{klm}## as "input"
blashmet said:
Can you have a tensor that isn't multi-linear? Thanks!

no!

but although it's multi-linear (linear in each slot), it isn't necessarily linear in each vector: if the same vector is fed into more than one slot, the result is not linear in that vector. See the PF library entry on tensors …

"Non-linear" tensors:

The equation [itex]a^i\ =\ T^i_{\ jk}b^jc^k[/itex] is linear in both b and c (meaning that a linear transformation on b or c induces a linear transformation on a).

But the equation [itex]a^i\ =\ T^i_{\ jk}b^jb^k[/itex] is not linear in b.

For example, in "non-linear optics", there are "higher-order" susceptibility tensors, each with the electric field vector as the only input (exactly as in the linear case), but with that input repeated one or more times:

[itex]\frac{1}{\varepsilon_0}P^i\ =\ \chi^i_{\ j}E^j\ +\ \chi^{(2)i}_{\ \ \ jk}E^jE^k\ +\ \chi^{(3)i}_{\ \ \ jkl}E^jE^kE^l\ +\ \cdots[/itex]​
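A small numerical sketch of why repeating the input breaks linearity (a toy 2-component field with made-up susceptibility values, keeping only the first two terms; not physical data):

[code]
import numpy as np

rng = np.random.default_rng(0)

# Made-up susceptibility tensors for a toy 2-component field.
chi1 = rng.normal(size=(2, 2))      # chi^i_j
chi2 = rng.normal(size=(2, 2, 2))   # chi^(2)i_jk

def polarization(E):
    """P^i / eps_0 = chi^i_j E^j + chi^(2)i_jk E^j E^k (first two terms only)."""
    return np.einsum('ij,j->i', chi1, E) + np.einsum('ijk,j,k->i', chi2, E, E)

E = np.array([1.0, 2.0])

# chi1 and chi2 are multilinear in their slots, but E appears twice in the
# chi^(2) term, so doubling E does not double P:
print(np.allclose(polarization(2 * E), 2 * polarization(E)))   # False
[/code]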
 
  • #8
I'm a real amateur at this, but I think only covariant tensors can be viewed as multilinear functionals, that is, only (0, n)-tensors. See the wiki for more. You put n vectors in and it spits out a real number; that's how I understand it.

In the book that I'm reading, Lee's Introduction to Smooth Manifolds, when he says for example "2-tensor" he means a (0, 2)-tensor, so I guess covariant tensors are more important than contravariant ones (of type (n, 0)).
 
  • #9
In finite dimensions, V** = V, where * means dual space. So you can define the other kind of tensor as a dual of the linear-function kind.

I.e. linear functions on a tensor space of V are the same as multilinear functions on V. Hence the dual of the multilinear functions on V is the tensor space of V, and the multilinear functions themselves are the tensor space of V*. I.e. the (0,2) tensors on V are the (2,0) tensors on V*, so if you know about V and V*, you only need to know about one type of tensor.

There is also another, more formal way to define tensors on V: as elements written as x⊗y⊗z, where x, y, z are in V, satisfying rules like

(u+v)⊗y⊗z = u⊗y⊗z + v⊗y⊗z,

but when you get through, the thing you have is dual to the multilinear (trilinear in this case) functions on V.
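The identification V** = V used here is via the canonical map that sends a vector to "evaluation at that vector", which is an isomorphism when V is finite-dimensional:

[itex]\iota:V\to V^{**},\qquad \iota(v)(\varphi)\ =\ \varphi(v)\quad\text{for all }\varphi\in V^*[/itex]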
 
  • #10
This is how I define "tensor" and "tensor field": (People often say "tensor" when they mean "tensor field").
Fredrik said:
A tensor at a point p in a manifold M is a multilinear function [itex]T:V^*\times\cdots\times V^*\times V\times\cdots\times V\rightarrow\mathbb R[/itex], where V is the tangent space at p and V* is the cotangent space at p. A tensor field is a function that assigns a tensor at p to each p.

This post and the ones it links to explain the basics of manifolds, tangent and cotangent spaces, and the relationship between coordinate systems and bases.

Alesak said:
In the book that I'm reading, Lee's Introduction to smooth manifolds,
Excellent choice.

tiny-tim said:
it's easiest to define a tensor in terms of its effect on "inputting" individual vectors,

but the "input" can be any tensor …

for example the metric ##g_{ij}## can have any tensor ##A^{ij}{}_{klm}## as "input"
The metric g is a tensor field that assigns a tensor gp to each point p. The domain of gp is ##T_pM\times T_pM##, so the "input" of gp is always two tangent vectors at p. The input of g is a point p in the manifold.

If X and Y are vector fields, the notation g(X,Y) can be used for the map ##p\mapsto g_p(X_p,Y_p)##. So it would make sense to say that g takes two vector fields to a scalar field (i.e. a real-valued function defined on a subset of the manifold). But neither g nor gp can have "any tensor" as input.

##g_{ij}A^{ij}{}_{lm}## isn't the result of g taking A as input. It can denote either the tensor field ##g(\partial_i,\partial_j) A(\mathrm{d}x^i,\mathrm{d}x^j,\cdot,\cdot)## (abstract index notation) or its lm component ##g(\partial_i,\partial_j) A(\mathrm{d}x^i,\mathrm{d}x^j,\partial_l,\partial_m)##.
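A component-level sketch of that contraction (the arrays here are arbitrary placeholders, chosen only to show which indices get summed):

[code]
import numpy as np

n = 3
g = np.eye(n)                                          # components g_ij in some basis
A = np.arange(n**4, dtype=float).reshape(n, n, n, n)   # components A^{ij}_{lm}

# g_ij A^{ij}_{lm}: sum over i and j, leaving a two-index array of components B_lm.
B = np.einsum('ij,ijlm->lm', g, A)
print(B.shape)   # (3, 3)
[/code]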
 

1. What are tensors in the context of multi-variable functions?

Tensors are multilinear maps whose components, once a basis is chosen, form multi-dimensional arrays of numbers. In the context of multi-variable functions, a tensor can be thought of as a function of several vector (or dual-vector) arguments that is linear in each argument. They allow for the analysis of how several quantities relate and change together.

2. How are tensors used to represent multi-variable functions?

Tensors represent multi-variable functions by assigning one array index to each argument. For example, a multilinear function of two vector arguments is represented by a two-index array of components. The values in the array are the outputs of the function on particular combinations of basis vectors.
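For instance, the array of components is just the multilinear function evaluated on basis vectors, and the array plus multilinearity recovers the function (a generic two-slot example with made-up components):

[code]
import numpy as np

n = 3
T = np.arange(n * n, dtype=float).reshape(n, n)   # components T_ij = T(e_i, e_j)

def T_func(u, v):
    """The bilinear function recovered from its component array."""
    return np.einsum('ij,i,j->', T, u, v)

e = np.eye(n)
# The component T[1, 2] equals the function evaluated on the corresponding basis vectors.
print(T[1, 2] == T_func(e[1], e[2]))   # True
[/code]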

3. Why are tensors useful for viewing multi-variable functions?

Tensors are useful for viewing multi-variable functions because they organize the dependence on each argument into its own slot (and, in components, into its own index). This makes it easier to see how a change in one argument affects the value of the function and how the arguments are related to each other.

4. Can tensors be used to solve problems involving multiple variables?

Yes, tensors can be used to solve problems involving multiple variables. They allow for the manipulation and analysis of multi-variable functions, making it easier to solve complex problems that involve multiple inputs and outputs. Tensors are commonly used in fields such as physics, engineering, and machine learning to solve real-world problems.

5. Do I need to have a strong mathematical background to understand tensors as multi-variable functions?

While a basic understanding of mathematics is helpful, it is not necessary to have a strong mathematical background to understand tensors as multi-variable functions. Tensors can be visualized and interpreted intuitively, making them accessible to those without advanced mathematical knowledge. However, a deeper understanding of linear algebra and calculus can aid in fully understanding and utilizing tensors in multi-variable functions.
