
Difference between tensors and matrices

  1. Jan 10, 2015 #1
    They look a lot like matrices and seem to work exactly like matrices. What is the difference between them? I have only worked with matrices, not tensors, because I can't find a tutorial online, but every time I have seen a tensor it has looked identical to a matrix.
     
  2. Jan 10, 2015 #2

    Orodruin

    Staff Emeritus
    Science Advisor
    Homework Helper
    Gold Member

    Rank 2 tensors can be represented by square matrices, but this does not make a tensor a matrix or vice versa. Tensors have very specific transformation properties when changing coordinates (in the case of Cartesian tensors, rotations).

    However, not all tensors are rank 2, and those that are not cannot be represented as a matrix (you would need an array with more than two indices). Also, not all matrices are tensors: there are non-square matrices, matrices that do not transform in the proper way to represent a tensor (a matrix is a priori only a rectangular array of numbers), and so on. For many applications you will only encounter tensors of rank 2 or lower, and then representation by matrices is very convenient.
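
    For anyone who wants to see both points numerically, here is a minimal sketch using NumPy (my own illustration; the rotation and the arrays are arbitrary examples, not from any library of tensors):

        import numpy as np

        # A rotation in the plane: the Cartesian change-of-coordinates matrix.
        theta = 0.3
        R = np.array([[np.cos(theta), -np.sin(theta)],
                      [np.sin(theta),  np.cos(theta)]])

        # A rank-2 Cartesian tensor is represented by a square matrix T.
        # Under the rotation its components transform as T' = R T R^T.
        T = np.array([[1.0, 2.0],
                      [3.0, 4.0]])
        T_rot = R @ T @ R.T

        # A rank-3 tensor needs a 3-index array: one rotation factor per index.
        S = np.arange(8.0).reshape(2, 2, 2)
        S_rot = np.einsum('ia,jb,kc,abc->ijk', R, R, R, S)

        # A non-square matrix is just a rectangular array of numbers and
        # cannot represent a Cartesian rank-2 tensor at all:
        M = np.ones((2, 3))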
     
  3. Jan 10, 2015 #3

    FactChecker

    Science Advisor
    Gold Member

    A matrix is a mathematical object that does not have to transform, when the coordinates change, the way a physical quantity would. A tensor must transform to new coordinates the way a physical quantity would.
    Example: The identity matrix is a diagonal matrix of 1's. Whether the coordinate system is in feet or inches, the diagonal entries are still 1's. So the identity matrix, taken as a bare array of numbers, does not transform correctly (from coordinates in feet to coordinates in inches) to represent a physical entity. For the same matrix to represent a tensor, its components would have to pick up a factor of 12 or 1/12 for each index under the change of units (there are covariant and contravariant tensors), so the diagonal 1's in feet would become, for a purely contravariant or purely covariant rank-2 tensor, 144's or 1/144's in inches.
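
    To make the bookkeeping concrete, a small NumPy sketch (my own illustration; the feet-to-inches Jacobian is just the constant factor 12 per coordinate) showing how a purely covariant rank-2 tensor whose components in feet are the identity rescales in inches:

        import numpy as np

        # Change of coordinates x_inches = 12 * x_feet, so the Jacobian is
        # J = d(x_inches)/d(x_feet) = 12 * I, with inverse (1/12) * I.
        J = 12.0 * np.eye(3)
        J_inv = np.linalg.inv(J)

        g_feet = np.eye(3)  # a (0,2)-tensor with identity components in feet

        # Covariant rank-2 transformation: g'_ij = (J^-1)^a_i (J^-1)^b_j g_ab
        g_inches = J_inv.T @ g_feet @ J_inv
        print(np.diag(g_inches))  # [1/144, 1/144, 1/144], not 1's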
     
  4. Jan 10, 2015 #4

    Orodruin

    Staff Emeritus
    Science Advisor
    Homework Helper
    Gold Member

    While I agree that the transformation properties of tensors are important, I think the unit matrix is not a very illuminating (and somewhat misleading) example. In particular, consider the (1,1)-tensor ##\delta^\alpha_\beta## such that ##\delta^\alpha_\beta V^\beta = V^\alpha##, where ##V## is a vector. This tensor is represented by the unit matrix in all frames: the unit matrix is a transformation from the vector space of column matrices to itself and therefore naturally represents a (1,1)-tensor. (You can fiddle around to make a square matrix represent an arbitrary rank-2 tensor, but I would say that is slightly less natural.) The tensor transformation properties follow trivially from the chain rule.
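
    A quick numerical check of this, as a NumPy sketch (my own illustration; ##A## is an arbitrary invertible change of basis):

        import numpy as np

        rng = np.random.default_rng(0)
        A = rng.normal(size=(3, 3))   # an arbitrary invertible change of basis
        A_inv = np.linalg.inv(A)

        delta = np.eye(3)  # components of the (1,1)-tensor delta^alpha_beta

        # (1,1) transformation: delta'^i_j = A^i_a (A^-1)^b_j delta^a_b
        delta_new = A @ delta @ A_inv
        print(np.allclose(delta_new, np.eye(3)))  # True in every frame

        # Contrast: the same numbers read as a (0,2)-tensor do change:
        g_new = A_inv.T @ delta @ A_inv
        print(np.allclose(g_new, np.eye(3)))  # generally False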
     
  5. Jan 10, 2015 #5

    FactChecker

    Science Advisor
    Gold Member

    Ok. I retract my statement and will stay out of this discussion.
     
  6. Jan 10, 2015 #6
    Usually tensors are associated with a linear vector space ##V## and its dual space ##V^*##. A tensor of rank ##(p,q)## is then a multilinear function from ##p## copies of ##V## and ##q## copies of ##V^*## to some scalar field (usually ##\mathbb{R}## or ##\mathbb{C}##). In this sense, a tensor is an element of ##V^{*p}\otimes V^{**q}##, where ##V^{**}## is the space of all linear functionals on ##V^{*}##.

    When ##V## is finite dimensional, ##V^{**}=V##, and a rank ##(p,q)## tensor is in ##V^{*p}\otimes V^{q}##. A linear transformation from ##V## to itself can be represented by an element ##\omega \in V\otimes V^*##. If we pick bases ##\epsilon_j## for ##V## and ##\epsilon_k^*## for ##V^*## (with ##\epsilon_k^*(\epsilon_j)=\delta_{kj}##), then we can expand ##\omega## as ##\omega = \sum_{j,k} \omega_{jk}\epsilon_j\otimes \epsilon_k^*##, and the components ##\omega_{jk}## can be interpreted as elements of a matrix.
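
    As a concrete NumPy sketch (my own illustration; the basis ##B## and the map ##A## are arbitrary choices): if the columns of ##B## are the basis ##\epsilon_j## and the rows of ##B^{-1}## are the dual basis ##\epsilon_k^*##, then the components of a linear map in that basis are ##B^{-1} A B##.

        import numpy as np

        rng = np.random.default_rng(1)
        A = rng.normal(size=(3, 3))  # a linear map V -> V, an element of V (x) V*
        B = rng.normal(size=(3, 3))  # columns are a basis epsilon_j of V
        B_dual = np.linalg.inv(B)    # rows are the dual basis epsilon_k^*

        # Dual-basis condition epsilon_k^*(epsilon_j) = delta_kj:
        print(np.allclose(B_dual @ B, np.eye(3)))  # True

        # Components of the map in this basis: omega_jk = epsilon_j^*(A epsilon_k)
        omega = B_dual @ A @ B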

    Matrices have a different kind of structure from tensors. While matrices can be used to represent tensors in a wide range of settings, matrix multiplication (say, between square matrices) is only meaningful in the tensor context when the tensors have the form ##V\otimes V^{*}## for some vector space ##V##, or when there is a linear map between ##V## and ##V^*## (i.e., an inner product or metric) and ##p+q## is even (or when you consider multiplication between special families of tensors). Tensors have more structure than matrices, but questions about matrices have a very different flavor from questions about tensors.
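
    For the matrix-multiplication point, a small NumPy sketch (my own illustration): composing two (1,1)-tensors, i.e., contracting the ##V^*## slot of one against the ##V## slot of the other, is exactly matrix multiplication, while two (0,2)-tensors cannot be contracted this way without a metric to raise an index.

        import numpy as np

        rng = np.random.default_rng(2)
        S = rng.normal(size=(3, 3))  # components of a (1,1)-tensor S^i_j
        T = rng.normal(size=(3, 3))  # components of a (1,1)-tensor T^j_k

        # Contracting S's lower index with T's upper index is matrix product:
        composed = np.einsum('ij,jk->ik', S, T)
        print(np.allclose(composed, S @ T))  # True

        # For two (0,2)-tensors g_ij and h_jk there is no matching upper index;
        # one first needs a metric (an isomorphism V ~ V*) to raise an index.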
     
  7. Jan 10, 2015 #7
    Can anyone link me to a tutorial on tensors?
     
  8. Jan 10, 2015 #8
    What do you want to use tensors for? (e.g. general relativity, quantum mechanics, engineering/materials science, information theory/statistics)
     