Several Dimensional Multiplication


by ScaryManInThePa
Tags: dimensional, multiplication
ScaryManInThePa
#1
Sep20-11, 09:36 PM
I've been googling but I haven't found anything useful. I'm trying to understand how multiplying a matrix (or array of numbers, whatever you want to call it: a matrix is 2D, a vector is 1D) of several dimensions works. I'm writing some models for solving linear programs, and they involve a lot of summations which look very sloppy, so I'm hoping to represent things more elegantly. So how does multiplication of several-dimensional objects work?

Thanks
Simon_Tyler
#2
Sep21-11, 12:17 AM
So are you talking about a matrix in 7 dimensions or a 7 dimensional array of numbers? That's a little confusing... by a 7D matrix I mean a 7*7 array of numbers. By a 7 dimensional array of numbers I mean a n*n*n*n*n*n*n array of numbers for some positive integer n.

A matrix is a linear object that eats a vector and returns a vector. So M.v = u can be written using indices as M_ij v_j = u_i, where the index j is summed over the dimension of the space, i.e. 1 to n, and the index i is free to be any value in the range 1 to n. See the Einstein summation convention. This clearly works for any integer dimension n. (And it can be extended to other types of linear space - see abstract index notation.)
Equivalently, a matrix is a multilinear object that eats two vectors and returns a scalar: u.M.v = u_i M_ij v_j.

Note that linear means that M.(u + x*v) = M.u + x*M.v for x a scalar and u, v vectors. This is extended naturally to multilinearity.
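If it helps to see this concretely, here's a quick numpy sketch (numpy is one of the tools mentioned below; the array values are just for illustration) of M_ij v_j = u_i, u_i M_ij v_j, and the linearity property:

```python
import numpy as np

# A 3x3 matrix M acting on a vector v: u_i = M_ij v_j
M = np.arange(9.0).reshape(3, 3)
v = np.array([1.0, 2.0, 3.0])

u = np.einsum('ij,j->i', M, v)        # same result as M @ v
s = np.einsum('i,ij,j->', v, M, v)    # the scalar v.M.v

# Linearity check: M.(u + x*v) = M.u + x*M.v for a scalar x
x = 2.5
lhs = np.einsum('ij,j->i', M, u + x * v)
rhs = np.einsum('ij,j->i', M, u) + x * np.einsum('ij,j->i', M, v)
assert np.allclose(lhs, rhs)
```

The index string in np.einsum is essentially the Einstein summation convention made executable: repeated letters are summed, letters after the arrow are free indices.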

Higher order tensors are linear objects that eat more vectors and return a scalar.
They can be written with more indices, T_ijk..., and are the higher-dimensional arrays that I mentioned above.
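For example, a 3-index tensor as a trilinear map, sketched in numpy (random values, purely illustrative):

```python
import numpy as np

# A rank-3 tensor T eats three vectors and returns a scalar:
# s = T_ijk a_i b_j c_k
n = 4
rng = np.random.default_rng(0)
T = rng.standard_normal((n, n, n))    # an n*n*n array of numbers
a, b, c = rng.standard_normal((3, n))

s = np.einsum('ijk,i,j,k->', T, a, b, c)

# Feeding in only one vector leaves a 2-index object, i.e. a matrix:
M = np.einsum('ijk,k->ij', T, c)
assert M.shape == (n, n)
assert np.isclose(s, a @ M @ b)
```

Note how "partially feeding" the tensor one vector lowers its order by one; that pattern generalizes to any number of indices.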

(One thing I've not mentioned is the difference between contravariant and covariant indices, which is related to the vector space and its dual - but you probably don't need to worry about this.)

When working with explicit realisations of tensors and matrices, sums over the indices become almost unavoidable. These sums can be optimised and given notational conveniences like the dot product in various computer languages and libraries such as Matlab, Mathematica, numpy, etc...

And when working with them by hand, the Einstein (implicit) summation convention is very handy. As are the various inner product and dot product notations.
ScaryManInThePa
#3
Sep21-11, 01:07 AM
I mean like a 7-dimensional array. So I was thinking a 7-dimensional "square" array (all dimensions the same, so an n^7 array) multiplied by another 7-dimensional array of the same size (n^7) would give me a new 7-dimensional array. A 7-dimensional array times a 6-dimensional array might give me a new 6-dimensional array. A 7-dimensional array times a 1-dimensional array (a vector) might give me a new vector.
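To make the shapes concrete, here's a small numpy sketch (with n = 2 so the arrays stay tiny): contracting one index of an n^7 array with a vector leaves an n^6 array, and contracting all seven index pairs of two n^7 arrays leaves a single number.

```python
import numpy as np

n = 2
A = np.arange(n**7, dtype=float).reshape((n,) * 7)  # a "square" 7D array
v = np.array([1.0, -1.0])

# Contract the last index of A with the vector v: the result has 6 indices.
B = np.tensordot(A, v, axes=([6], [0]))
assert B.shape == (n,) * 6

# Contract all seven indices of A against another 7D array: a scalar.
C = np.ones((n,) * 7)
s = np.tensordot(A, C, axes=7)
assert s.shape == ()   # 0-dimensional array, i.e. a single number
```

So each contracted index pair removes two dimensions from the combined object, which matches the patterns you guessed at.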

I think tensor is what I was thinking of. Do you know a good text (for someone with limited math experience) that covers the arithmetic of these and all their properties? Or what college class would teach this? I took an intro linear algebra class a while ago, but we didn't learn about these.

Simon_Tyler
#4
Sep21-11, 04:07 AM

I don't know of a good, easy text. Maybe you can find something in the math.stackexchange question An Introduction to Tensors. The Wikipedia article is also not bad.

Physicists and mathematicians often talk about tensors in very different ways. Depending on what you're doing, the different approaches will be more or less useful. The best way to think of them is as multilinear maps - most conveniently written using abstract index notation, which is simply made concrete (with explicit summation and/or integration) when need be.

Depending on the metric on the space you're working in, you might need to worry about covariant and contravariant indices. These correspond to slots in the tensor that eat vectors from either the given vector space or its dual space. A nondegenerate metric gives an isomorphism between these spaces, and if the basis is chosen to be orthonormal, then the metric is proportional to the identity matrix and index position can be ignored.
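As a small numerical sketch of what lowering an index with a metric does (using the Minkowski metric as a nondegenerate, non-identity example; the component values are arbitrary):

```python
import numpy as np

# Lowering an index with a metric: v_i = g_ij v^j.
# With the Minkowski metric, covariant and contravariant
# components of the same vector differ in sign:
g = np.diag([1.0, -1.0, -1.0, -1.0])
v_up = np.array([2.0, 1.0, 0.0, 3.0])
v_down = g @ v_up

# With an orthonormal Euclidean basis the metric is the identity,
# so lowering does nothing and index position can be ignored:
assert np.allclose(np.eye(4) @ v_up, v_up)
```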

As for multiplying a couple of 7-index tensors, there are multiple ways of contracting the indices. The suggestions you made are all possible...
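For instance, with np.einsum the choice of contraction is explicit in the index string. A sketch with two random n^7 arrays (n = 2, random values just for illustration):

```python
import numpy as np

n = 2
S = np.random.default_rng(1).standard_normal((n,) * 7)
T = np.random.default_rng(2).standard_normal((n,) * 7)

# Contract one index pair: a 12-index result.
one = np.einsum('abcdefg,gqrstuv->abcdefqrstuv', S, T)
assert one.shape == (n,) * 12

# Contract all seven index pairs: a scalar.
full = np.einsum('abcdefg,abcdefg->', S, T)

# Contract six pairs and leave one index free on each side: a matrix.
mat = np.einsum('abcdefg,abcdefh->gh', S, T)
assert mat.shape == (n, n)
```

Each of these is a legitimate "product" of the two 7-index tensors; which one you want depends entirely on which indices your linear program actually sums over.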

Anyway, the wikipedia article http://en.wikipedia.org/wiki/Abstract_index_notation is pretty good, and googling for index notation primer turns up some decent results.

