
Several Dimensional Multiplication

  1. Sep 20, 2011 #1
    I've been googling but I haven't found anything useful. I am trying to understand how multiplying an object of numbers with several dimensions works (a matrix is 2D, a vector is 1D - whatever you want to call the higher-dimensional version). I am writing some models for solving linear programs, and they have a lot of summations which look very sloppy; I am hoping to represent things more elegantly. So how does multiplication of several-dimensional objects work?

  3. Sep 21, 2011 #2
    So are you talking about a matrix in 7 dimensions or a 7-dimensional array of numbers? That's a little ambiguous... by a 7D matrix I mean a 7*7 array of numbers. By a 7-dimensional array of numbers I mean an n*n*n*n*n*n*n array of numbers for some positive integer n.
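    The distinction is easy to see in numpy (a minimal sketch, with n = 2 chosen arbitrarily for the second case):

```python
import numpy as np

# a "7D matrix": a 7*7 array, i.e. a linear map on a 7-dimensional space
m = np.zeros((7, 7))

# a 7-dimensional array: n*n*n*n*n*n*n numbers, here with n = 2
a = np.zeros((2,) * 7)

print(m.shape, a.shape)  # (7, 7) (2, 2, 2, 2, 2, 2, 2)
print(m.size, a.size)    # 49 128
```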

    A matrix is a linear object that eats a vector and returns a vector. So M.v = u can be written using indices as M_ij v_j = u_i, where the index j is summed over the dimension of the space, i.e. 1 to n, and the index i is free to take any value in the range 1 to n. See http://en.wikipedia.org/wiki/Einstein_summation_convention.
    Equivalently, a matrix is a multilinear object that eats two vectors and returns a scalar: u.M.v = u_i M_ij v_j.

    Note that linear means that M.(u + x*v) = M.u + x*M.v for x a scalar and u, v vectors. This is extended naturally to multilinearity.
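    As a concrete check, here's the index sum written out next to the same contraction done with numpy's einsum (a sketch with a made-up 2*2 matrix):

```python
import numpy as np

M = np.array([[1.0, 2.0],
              [3.0, 4.0]])
v = np.array([5.0, 6.0])

# u_i = sum_j M_ij v_j, written as an explicit index sum
u_explicit = np.array([sum(M[i, j] * v[j] for j in range(2))
                       for i in range(2)])

# the same contraction in Einstein-summation notation
u_einsum = np.einsum('ij,j->i', M, v)

print(u_explicit)  # [17. 39.]
print(u_einsum)    # [17. 39.]
```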

    Higher order tensors (http://en.wikipedia.org/wiki/Tensor) are multilinear objects that eat more vectors and return a scalar.
    They can be written with more indices T_ijk... and are the higher-dimensional arrays that I mentioned above.
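    For example, a 3-index tensor eating three vectors to return a scalar looks like this in numpy (random data, just to illustrate the shapes):

```python
import numpy as np

rng = np.random.default_rng(0)
T = rng.standard_normal((3, 3, 3))     # T_ijk
u, v, w = rng.standard_normal((3, 3))  # three vectors of length 3

# scalar = T_ijk u_i v_j w_k, summing over all three indices
s = np.einsum('ijk,i,j,k->', T, u, v, w)
print(s.shape)  # () -- a scalar, no free indices left
```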

    (One thing I've not mentioned is the difference between contravariant and covariant indices, which is related to the vector space and its dual - but you probably don't need to worry about this.)

    When working with explicit realisations of tensors and matrices, sums over the indices become almost unavoidable. These sums can be optimised and given notational conveniences like the dot product in various languages and libraries such as Matlab, Mathematica, numpy, etc...

    And when working with them by hand, the Einstein (implicit) summation convention is very handy, as are the various inner product and dot product notations.
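    In numpy, for instance, these are just different spellings of the same contraction M_ij v_j (a quick sketch):

```python
import numpy as np

M = np.arange(9.0).reshape(3, 3)
v = np.array([1.0, 0.0, -1.0])

# three equivalent spellings of the contraction M_ij v_j
a = M @ v                        # operator notation
b = M.dot(v)                     # method notation
c = np.einsum('ij,j->i', M, v)   # explicit index notation

print(np.allclose(a, b) and np.allclose(b, c))  # True
```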
  4. Sep 21, 2011 #3
    I mean like a 7-dimensional array. So I was thinking a 7-dimensional "square" array (all dimensions the same, so an n^7 array) multiplied by another 7-dimensional array of the same size (n^7) would give me a new 7-dimensional array. A 7-dimensional array by a 6-dimensional array might give me a new 6-dimensional array. A 7-dimensional array by a 1-dimensional array (vector) might give me a new vector.

    I think a tensor is what I was thinking of. Do you know a good text (for someone with limited math experience) that covers the arithmetic and properties of these? Or what college class would teach this? I took an intro linear algebra class a while ago, but we didn't learn about these.
  5. Sep 21, 2011 #4
    I don't know of a good, easy text. Maybe you can find something in the math.stackexchange question http://math.stackexchange.com/questions/10282/an-introduction-to-tensors. The wikipedia article is also not bad.

    Physicists and mathematicians often talk about tensors in very different ways. Depending on what you're doing, the different approaches will be more or less useful. The best way to think of them is as multilinear maps - most conveniently written using abstract index notation, which is simply made concrete (with explicit summation and/or integration) when need be.

    Depending on the metric in the space you're working in, you might need to worry about covariant and contravariant indices. These correspond to slots in the tensor that eat vectors in the given vector space or its dual space. A nondegenerate metric gives an isomorphism between these spaces, and if the basis is chosen to be orthonormal, then the metric is proportional to the identity matrix and index position can be ignored.
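    A small numpy sketch of that last point, using a made-up Minkowski-style metric for contrast with the identity:

```python
import numpy as np

g = np.diag([-1.0, 1.0, 1.0])     # a nondegenerate, non-Euclidean metric
v_up = np.array([2.0, 3.0, 4.0])  # contravariant components v^j

# lower the index: v_i = g_ij v^j -- here the result differs from v^j
v_down = np.einsum('ij,j->i', g, v_up)
print(v_down)  # [-2.  3.  4.]

# with the identity metric, lowering does nothing, which is why index
# position can be ignored in an orthonormal Euclidean basis
print(np.einsum('ij,j->i', np.eye(3), v_up))  # [2. 3. 4.]
```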

    As for multiplying a couple of 7-index tensors, there are multiple ways of contracting the indices. The suggestions you made are all possible...
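    To make the different contraction choices concrete, here's a numpy sketch using np.tensordot, with a small dimension n = 2 standing in for your 7-index arrays:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2
A = rng.standard_normal((n,) * 7)  # a 7-index array A_abcdefg
v = rng.standard_normal(n)         # a vector

# contract the last index of A with v: 7 + 1 - 2 = 6 indices remain
B = np.tensordot(A, v, axes=([6], [0]))
print(B.ndim)  # 6

# contract A's last index against another 7-index array's first index:
# 7 + 7 - 2 = 12 indices remain
C = rng.standard_normal((n,) * 7)
D = np.tensordot(A, C, axes=([6], [0]))
print(D.ndim)  # 12

# or contract all seven index pairs to get a scalar
s = np.tensordot(A, C, axes=7)
print(s.ndim)  # 0
```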

    Anyway, the wikipedia article http://en.wikipedia.org/wiki/Abstract_index_notation is pretty good, and googling for index notation primer turns up some decent results.