# Three-way arrays

1. Sep 14, 2004

### jcsd

Can someone describe to me the basic algebra of three-way arrays?

2. Sep 15, 2004

### matt grime

Look up three dimensional tensors, since that is probably what you want.

3. Sep 15, 2004

### jcsd

yes, I assume a 3-way array is a tensor of rank 3 (provided that it has n rows, n columns and n of whatever they call the third direction); in fact I'm sure it must be. But what I was wondering is this: as a square matrix is basically a mixed tensor of rank 2 (provided that you define column matrices as vectors and row matrices as one-forms, or vice versa), what are the covariant and contravariant orders of a three-way array? Or is that entirely down to the way you define them (for example, it's possible to represent a covariant tensor of rank two, such as the metric tensor, as a square matrix as long as you realize that its inner product with a vector is not another vector, but a one-form)?
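To make the last point concrete, here is a minimal pure-Python sketch (no libraries, names are my own) of a rank-(0,2) tensor stored as a square matrix: contracting it with a vector "lowers the index", so the numbers that come out are the components of a one-form, not of another vector.

```python
# Sketch: a metric g stored as a square matrix. Multiplying it into a
# vector produces the components of the one-form g(v, .), i.e. it lowers
# the index. The helper name `lower_index` is just for illustration.

def lower_index(g, v):
    """Contract g_ij with v^j: returns v_i = sum_j g[i][j] * v[j]."""
    n = len(v)
    return [sum(g[i][j] * v[j] for j in range(n)) for i in range(n)]

# A Minkowski-style metric diag(1, -1) and a vector v = (3, 4):
g = [[1, 0],
     [0, -1]]
v = [3, 4]

v_lower = lower_index(g, v)
print(v_lower)  # [3, -4] -- covector components, not a new vector
```

The point of the hedge in the original post is visible here: the array `g` looks just like any other square matrix, and only the bookkeeping tells you its output should be read as a one-form.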

4. Sep 17, 2004

### mathwonk

how interesting. it is amazing how many puzzles arise when you think of tensors as arrays of numbers instead of conceptual objects. i.e. a 3 dimensional array is just a rectangular block of numbers. but what does it mean? which are the rows and which are the columns?

notice that the problem already arises in two dimensions. i.e. as you (essentially) point out, a square matrix is actually a tensor neither of rank (0,2) nor of rank (2,0), but of rank (1,1), since it naturally takes a column vector, i.e. a vector, to another column vector, and takes a row vector to another row vector. so if you want a rank (0,2) or (2,0) object, you have to take the transpose of the result of the multiplication to get the right thing.

so how do you accurately represent a tensor of rank (2,0) or (0,2)? I guess it is a sequence of row vectors, or a vertical array of column vectors! It is too complicated for me.
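The distinction can be sketched in a few lines of pure Python (helper names are mine, for illustration only): the very same square array of numbers can be used as a (1,1) tensor, eating one vector and returning a vector, or as a (0,2) tensor, eating two vectors and returning a scalar. The array alone doesn't decide which.

```python
# One array, two readings: as a linear map (1,1) or a bilinear form (0,2).

def as_1_1(a, v):
    """Treat a as a linear map: (a v)^i = sum_j a[i][j] * v^j."""
    return [sum(a[i][j] * v[j] for j in range(len(v))) for i in range(len(a))]

def as_0_2(a, v, w):
    """Treat a as a bilinear form: a(v, w) = sum_ij v^i * a[i][j] * w^j."""
    return sum(v[i] * a[i][j] * w[j]
               for i in range(len(v)) for j in range(len(w)))

a = [[1, 2],
     [3, 4]]
v, w = [1, 0], [0, 1]

print(as_1_1(a, v))     # [1, 3] -- another vector
print(as_0_2(a, v, w))  # 2     -- a scalar
```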

why not just say what they are? i.e. sections of the tensor product of 2 tangent spaces or two cotangent spaces? or equivalently, linear maps from the tangent or cotangent space to the cotangent or tangent space. i.e. Hom(T,T*) or Hom(T*,T).

In general, all tensors of all ranks are representable as just multilinear maps from some product of tangent and cotangent spaces to scalars.

I.e. Multihom(TxTxTx...xT x T*xT*x.....xT*, scalars)

the rank is (r,s) if there are s copies of T and r copies of T*. (i hope).

"they laughed when i sat down at the piano, but they actually cried when i played."

Last edited: Sep 17, 2004
5. Sep 17, 2004

### matt grime

I think that I should have made explicit that the thrust of my answer was just to say that 'arrays' don't have any algebra naturally associated to them; an array is just a set indexed by some variables. I can define an addition on arrays, or scalar multiplication, or even a 'multiplication' of arrays, but that doesn't mean that the assignment means anything.

6. Sep 18, 2004

### mathwonk

this poses an interesting challenge: can someone propose a generalized version of a matrix to represent a multilinear map?

well perhaps we could take a cue from the representation of the scalar product, i.e. of a tensor of type (0,2), as a matrix, i.e. technically as a tensor of type (1,1), just by preceding one of the argument vectors by transposing it.

So maybe we could represent all tensors of type (r,s) this way. i.e. given r arguments from the dual space and s arguments from the space, transpose all the dual guys until they become vectors, then just contract them,...

i.e. given an n by n by n by ... n array of numbers, and an n tuple of numbers, just pick a direction along the axes of the array, and contract all the guys pointing in that direction against your n tuple. then you have an array of one dimension smaller, and can contract it against another n tuple, in any direction you like, etc. etc.
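The recipe above can be sketched directly in pure Python over nested lists (names and the axis convention are my own): pick an axis of an n x n x n array, contract against an n-tuple, and you are left with an n x n matrix, which you can contract again.

```python
# mathwonk's recipe: contract a 3-way array t[i][j][k] with an n-tuple x
# along a chosen axis, leaving an array of one dimension smaller.

def contract(t, x, axis):
    """Contract the 3-way array t with the tuple x along axis 0, 1 or 2."""
    n = len(x)
    if axis == 0:   # sum_i x[i] * t[i][j][k]
        return [[sum(x[i] * t[i][j][k] for i in range(n))
                 for k in range(n)] for j in range(n)]
    if axis == 1:   # sum_j x[j] * t[i][j][k]
        return [[sum(x[j] * t[i][j][k] for j in range(n))
                 for k in range(n)] for i in range(n)]
    # axis == 2:    sum_k x[k] * t[i][j][k]
    return [[sum(x[k] * t[i][j][k] for k in range(n))
             for j in range(n)] for i in range(n)]

# t[i][j][k] = 100*i + 10*j + k, with n = 2
t = [[[100*i + 10*j + k for k in range(2)] for j in range(2)]
     for i in range(2)]

m = contract(t, [1, 1], axis=0)   # now a 2 x 2 matrix: one dimension smaller
row = [sum(1 * m[j][k] for k in range(2)) for j in range(2)]  # contract again
print(m)    # [[100, 102], [120, 122]]
print(row)  # [202, 242]
```

Each contraction eats one "way" of the array, exactly as the post describes: three-way array, then matrix, then vector.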

Last edited: Sep 18, 2004
7. Sep 18, 2004

### jcsd

Sorry it took me a little time to reply; here are my thoughts on the subject (some of it was written before I read some of the other posts in the thread, so it reiterates a few points already made).

I quite like the matrix representations of tensors, as they're useful for transformations, but most texts which cover the subject just say "matrix notation fails for tensors of rank greater than two" without bothering to say why. As I said before, the answer is that, though a tensor of rank r + s can have the same components as an n x n^(r+s-1) (or n^(r+s-1) x n) matrix, such a matrix can't fully represent the relationships between the components.

So then I thought: can matrix notation be extended? The most obvious extension of matrix representations for tensors of rank > 2 would be generalized (n x n x n ...) arrays, which I suppose could be represented by 'matrices' whose elements are themselves matrices, or, for tensors of rank greater than 4, generalized arrays (this would probably be the best place to start for defining an algebra for these generalized arrays). In fact I was quite pleased with the idea, as I'd never heard of anyone using arrays with more than two 'ways'.

Of course, if you ever have an idea that's remotely good, you can be sure that someone has already thought of it before you (and in general, the better the idea, the more people have thought of it), and a quick google brought up a paper under the heading 'quasi-tensors' which compared two different ways of representing three-way arrays, a concept which it appears was first thought of in 1960 (interestingly, the search also brought up a chemical engineering textbook on multi-way arrays). So anyway, robbed of any delusion that I may have had an original idea, I started to think about exactly how these mathematical objects relate to tensors (which was why I asked the original question, as there doesn't seem to be a lot of information about these objects and I was wondering what algebras had been associated with them).

An n x 1 (or 1 x n) matrix is obviously a vector of some sort; similarly an n x n matrix is obviously a rank-two tensor of some sort. If you define a column vector as a vector and a row vector as a covector, then an n x n matrix is a tensor of type (1,1). But if you generalize the array to a three-way array, a row vector is an n x 1 x 1 array and a column vector is a 1 x n x 1 array, so what kind of vector is a 1 x 1 x n array (and of course the problem multiplies each time you consider a new 'way' of a generalized array)? It seems to me that it is just a matter of definition: to represent a tensor by an (r+s)-way array, you need to define each 'way' of the generalized array as either contravariant or covariant, and then we should be able to recover the algebra of tensors in a fairly natural manner (the definitions of tensor addition and subtraction become clear straight away, and I'm pretty sure that the inner and outer products can be defined logically, though I haven't yet tried to define them in terms of these generalized arrays).
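Two of the operations mentioned above do come out naturally; here is a minimal pure-Python sketch (the helper names `add3` and `outer3` are mine): addition of generalized arrays is componentwise, and an outer product of three vectors builds a 3-way array, just as u v^T builds a matrix.

```python
# Componentwise addition and outer product for 3-way arrays as nested lists.

def add3(a, b):
    """Componentwise sum of two 3-way arrays of the same shape."""
    return [[[a[i][j][k] + b[i][j][k] for k in range(len(a[0][0]))]
             for j in range(len(a[0]))] for i in range(len(a))]

def outer3(u, v, w):
    """Outer product t[i][j][k] = u[i] * v[j] * w[k]: a rank-one 3-way array."""
    return [[[u[i] * v[j] * w[k] for k in range(len(w))]
             for j in range(len(v))] for i in range(len(u))]

t = outer3([1, 2], [1, 0], [0, 1])
s = add3(t, t)
print(t[1][0][1], s[1][0][1])  # 2 4
```

Whether each 'way' of `t` is contravariant or covariant is, as the post says, purely a matter of definition; these operations work the same either way, and only contraction forces you to decide.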

It is certainly an interesting topic.

8. Sep 18, 2004

### mathwonk

maybe your book just meant that 3 dimensional matrices are harder to write on the page. clearly if you have a 3 dimensional array of numbers laid out on the coordinate points of the unit cube in x,y,z space, then given a vector, you could dot it with, say, all the vertical vectors in the cube and get a square matrix, i.e. a tensor of type (1,1). thus such an array would be a linear map from vectors to tensors of type (1,1), hence itself a tensor.

9. Sep 18, 2004

### jcsd

Looking at the particular book, it was published in 1959!

10. Sep 18, 2004

### mathwonk

does that sound like a long time ago to you??

11. Sep 19, 2004

### mathwonk

this question is helping me understand the various points of view on tensors better.

i.e. some people think of tensors in a way analogous to the way they think of linear maps, as matrices, whereas i too think of tensors the way i think of linear maps: via the axioms they satisfy. I follow Emil Artin's advice in his book Geometric Algebra: never introduce matrices unless you need to compute something, like a determinant, then throw them out again afterwards.

But getting back to the question of "arrays" of higher dimension than 2: recall there are actually many ways to write a matrix. some people, like me, usually just write a single letter like M, or, if they want to represent the entries, a general letter for the entries with subscripts, like {aij} (sorry about the subscripts).

Now there is no hindrance to writing more subscripts, like {aijk}, and getting a representation of a three dimensional matrix, or array; so actually, unless you want to write one out physically in space, this is a pretty good way to write a 3 dimensional array.

Thus the method of writing letters with subscripts really is the matrix representation of a tensor. And it works as well for that as writing n tuples of numbers works for writing vectors in dimensions higher than two. i.e. if you think (a1,...,an) is a vector in n dimensions even though you do not draw it, then you can also think of {aijkl} as a 4 dimensional tensor even though it is hard to lay out fully in 4 dimensional space.

Now a calculator can surely be programmed to actually multiply these things just using the indices, without looking at them in a physical picture; hence there is not much difference between a 3 dimensional array and a 10 dimensional array, as tensors of those types.
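That "just using the indices" remark can be made literal with a small pure-Python sketch (the dict representation and the name `contract` are my own choices): store a tensor of any number of 'ways' as a dict keyed by index tuples, and contract against an n-tuple in any slot. Nothing ever needs to be laid out in physical space, so 3 ways and 10 ways cost the same to write down.

```python
# A tensor with any number of indices, stored as {index_tuple: value}.
# Contraction sums over one slot, leaving a tensor with one fewer index.

def contract(t, x, slot):
    """result[rest] = sum_i x[i] * t[..., i, ...], summing over index `slot`."""
    out = {}
    for idx, val in t.items():
        rest = idx[:slot] + idx[slot + 1:]
        out[rest] = out.get(rest, 0) + x[idx[slot]] * val
    return out

# a 3-way array {a_ijk} with a_ijk = i + j + k, indices running over {0, 1}
t = {(i, j, k): i + j + k
     for i in range(2) for j in range(2) for k in range(2)}

m = contract(t, [1, 2], slot=2)   # contract the k-slot: now a 2-way array
y = contract(m, [1, 1], slot=0)   # contract again: a 1-way array (a vector)
print(m[(0, 0)], y[(0,)])  # 2 7
```

The same `contract` works unchanged on a 10-index dict, which is exactly the post's point: the indices do all the work.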

abstractly, tensor multiplication by "contracting indices" is just the higher dimensional analog of expressions like [ aij ] [xi] = [yj] for multiplying matrices. of course you have to have upper and lower indices to tell the rows from the columns.

so using indices for tensors, is like using matrices for linear maps. the conceptual way on the other hand uses instead the fact that for a linear map we have f(x+y) = f(x) + f(y), and for a tensor we have this in each variable separately.

Absolutely true, at times one needs to be able to calculate something, and that is the only appropriate time for writing indices, or matrices. When one is only reasoning about them, or even making a formal calculation that depends only on multilinearity, indices and matrices are superfluous and cumbersome.

who for example would use matrices to check that if a matrix times each of several vectors is zero, then the same is true for multiplying by their sum?

Last edited: Sep 19, 2004
12. Sep 19, 2004

### matt grime

A quote you may like then, mathwonk:

A gentleman never uses bases unless he has to.

Don't know for sure who first said it, but it was told to me by one of my lecturers when I was a lowly undergraduate.

13. Sep 19, 2004

### mathwonk

i love that quote, Matt.

it reminds me however of one of my prolonged moments of stupidity, when joe kohn was trying to prove to me that something was continuous using coordinates. I objected that since there were no coordinates in the problem originally, he was not justified in introducing them to prove continuity.

i.e. i did not understand that if the statement i was proving could be formulated without using coordinates, then it was true provided it could be proved using them.