On Matrices

Gold Member
I have a few questions relating to matrices.

1. All of the matrices I've worked with so far dealt with real numbers or real functions of real numbers. Can you work instead in complex numbers, and do you have to add or remove any rules because of this?

2. All of the matrices I've worked with so far have been in 2d, that is, they've all been m-by-n matrices. However, I was wondering if mathematicians have explored the idea of m-by-n-by-o matrices, or three-dimensional matrices.

I have a few questions relating to matrices.
1. All of the matrices I've worked with so far dealt with real numbers or real functions of real numbers. Can you work instead in complex numbers, and do you have to add or remove any rules because of this?
Yes, the theory for complex matrices is very similar to that of real matrices. There are no extra rules that I know of. In fact, working with complex matrices simplifies a lot, in the sense that some beautiful theorems hold for complex matrices but not for real ones. One thing that comes to mind is that every complex matrix can be transformed into a triangular matrix (this is essentially Schur's theorem), while real matrices do not have that property.

Sometimes, however, you do need to add some rules. For example, when you start working with inner products, you make separate definitions for the complex and real cases (the complex inner product conjugates one argument). Another example is Hermitian matrices, which are the complex analogue of real symmetric matrices.
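To make the two differences above concrete, here is a small numpy sketch (the specific vectors and matrix are just illustrative values I made up) of the conjugated inner product and of a Hermitian matrix:

```python
import numpy as np

# The complex inner product conjugates one argument, unlike the real case:
# <x, y> = sum_i conj(x_i) * y_i
x = np.array([1 + 2j, 3 - 1j])
y = np.array([2 - 1j, 1j])

inner = np.vdot(x, y)  # np.vdot conjugates its first argument
assert np.isclose(inner, np.sum(np.conj(x) * y))

# With this definition, <x, x> is always a nonnegative real number,
# which would fail without the conjugation:
assert np.isclose(np.vdot(x, x).imag, 0) and np.vdot(x, x).real >= 0

# A Hermitian matrix equals its own conjugate transpose, A = A^H,
# the complex analogue of a real symmetric matrix A = A^T:
A = np.array([[2, 1 - 1j],
              [1 + 1j, 3]])
assert np.allclose(A, A.conj().T)
```

Note that the diagonal entries of a Hermitian matrix are forced to be real, just as in the real symmetric case.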

In fact, almost all of the matrix theory that you've seen over the real numbers carries over to matrices over a general field (or even a commutative ring). Nothing special about the real or complex numbers is needed. For instance, one can also look at rational matrices (though that theory is less satisfactory than the real or complex case).

I actually find it strange that you've not worked with complex matrices so far. I find it very likely that they will be introduced soon, because there is no reason to restrict to real matrices.

2. All of the matrices I've worked with so far have been in 2d, that is, they've all been m-by-n matrices. However, I was wondering if mathematicians explored the idea of m-by-n-by-o matrices, or three-dimensional matrices.
This is a very good question; I have asked myself the same question many times. But sadly, I don't know how you would define multiplication.

I'm pretty sure that the theory exists, but I don't know what it is. The reason it isn't well known is probably that there is no pressing need for 3D matrices. Matrices are handy because they represent linear and bilinear maps, because they represent linear systems of equations, and so on. I don't see what extra benefits 3D matrices would give. Plus, it is super hard to visualize 3D matrices...

But I'm hoping that somebody else will comment on this, because I too want to know the answer to this question.

Extra: while I don't know how you would define 3D-matrices, I do know how to define 4D-matrices:

Just take a matrix whose entries are themselves matrices. Thus you take $$M_n(M_m(\mathbb{K}))$$. This can be visualized as a 4D matrix. However, since matrices do not commute, matrices over matrices don't have a lot of good properties. For example, there is no easy notion of a determinant for such matrices (this is actually an active field of research).
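One concrete way to play with $$M_n(M_m(\mathbb{K}))$$ is to store it as a 4D numpy array, with one index pair selecting the block and the other pair selecting the entry inside the block. This sketch (sizes and random data are arbitrary choices of mine) multiplies two such objects block-wise and checks the result against ordinary multiplication of the flattened block matrices:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 3, 2  # an n-by-n matrix whose entries are m-by-m matrices

# Represent an element of M_n(M_m(K)) as a 4D array:
# A[i, j] is the m-by-m matrix sitting in block position (i, j).
A = rng.standard_normal((n, n, m, m))
B = rng.standard_normal((n, n, m, m))

# Block-wise product: (A*B)[i, j] = sum_k A[i, k] @ B[k, j],
# i.e. ordinary matrix multiplication whose "scalars" are matrices.
C = np.einsum('ikpr,kjrq->ijpq', A, B)

# Sanity check: flattening the blocks into one (n*m)-by-(n*m) matrix
# turns the block-wise product into an ordinary matrix product.
flatten = lambda X: X.transpose(0, 2, 1, 3).reshape(n * m, n * m)
assert np.allclose(flatten(C), flatten(A) @ flatten(B))
```

The sanity check is the whole point: viewed this way, a "4D matrix" of this kind carries no new information beyond an ordinary (n·m)-by-(n·m) matrix with a block structure imposed on it.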

Gold Member
Extra: while I don't know how you would define 3D-matrices, I do know how to define 4D-matrices:

Just take a matrix whose entries are themselves matrices. Thus you take $$M_n(M_m(\mathbb{K}))$$. This can be visualized as a 4D matrix. However, since matrices do not commute, matrices over matrices don't have a lot of good properties. For example, there is no easy notion of a determinant for such matrices (this is actually an active field of research).
Well, if we can take matrices of matrices to get 4D matrices, is it perhaps possible to take a vector (which could be considered a 1D matrix) and have each element of the vector be a 2D matrix? Or perhaps the reverse?

Yes, that is of course possible, but I don't see a way to define a suitable multiplication on it. If you have a vector of matrices, then the only ways you can multiply those would give you an n×n matrix of matrices or a 1×1 matrix of matrices, whereas you would want to get a vector of matrices back.

AlephZero
Homework Helper
In some ways you can consider a tensor as a sort of "n dimensional matrix".

In engineering, tensor equations are often rewritten as matrix equations, possibly because engineers don't like tensors, but also to make it easy to use numerical methods and computer software that already exist for matrices.

For example, the stress-strain relationship for a material is really an equation involving two second-order tensors and one fourth-order tensor, but it is often written in terms of a 6×6 matrix acting on 6-component vectors (Voigt notation), even though the matrix form obscures how to transform the entries into different coordinate systems compared with the tensor form.

The fourth-order stress-strain tensor has 81 elements, but for an isotropic material the large number of symmetries means that there are only two independent quantities. So the whole tensor can be defined by two physical parameters; for example, one way to do it is using Young's modulus and Poisson's ratio. Even for the most general anisotropic material, the symmetries of the stress and strain tensors (together with the existence of a strain-energy function) mean there are "only" 21 independent material properties, not 81.
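The "81 entries from 2 numbers" point can be sketched in numpy. For an isotropic material the stiffness tensor is C_ijkl = λ δ_ij δ_kl + μ(δ_ik δ_jl + δ_il δ_jk), where the Lamé constants λ, μ are computed from Young's modulus and Poisson's ratio (the numeric values below are just illustrative):

```python
import numpy as np

# Isotropic stiffness tensor C_ijkl built from two parameters (E, nu)
# via the Lame constants.
E, nu = 200e9, 0.3                      # illustrative values (roughly steel)
lam = E * nu / ((1 + nu) * (1 - 2 * nu))
mu = E / (2 * (1 + nu))

d = np.eye(3)                           # Kronecker delta
C = (lam * np.einsum('ij,kl->ijkl', d, d)
     + mu * (np.einsum('ik,jl->ijkl', d, d) + np.einsum('il,jk->ijkl', d, d)))

assert C.size == 81                     # 3^4 = 81 entries...
# ...but the tensor has the minor and major symmetries mentioned above:
assert np.allclose(C, C.transpose(1, 0, 2, 3))  # C_ijkl = C_jikl
assert np.allclose(C, C.transpose(0, 1, 3, 2))  # C_ijkl = C_ijlk
assert np.allclose(C, C.transpose(2, 3, 0, 1))  # C_ijkl = C_klij
```

It is exactly these symmetries that cut the 81 entries down to 21 independent ones in the general anisotropic case, and down to 2 in the isotropic case.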

But this rewriting of tensors as matrices doesn't look like a "matrix of matrices".

As micromass said, the problem with the idea of an "m x n x o matrix" is that it is hard to define anything corresponding to matrix multiplication, except by "slicing" it into a vector of conventional matrices - but as soon as you do that, you are treating one of the three "dimensions" as special compared with the other two.

What you can do with 3D matrices is perform some kind of ternary multiplication (that is, a multiplication with three arguments).

Remember that multiplication for matrices is defined as

$$(A\cdot B)_{i,j}=\sum_{k=1}^n{A_{i,k}B_{k,j}}$$

This generalizes to 3D matrices in the following way: for 3D matrices A, B and C, we define [A,B,C] as

$$[A,B,C]_{i,j,l}=\sum_{k=1}^n{A_{i,j,k}B_{i,k,l}C_{k,j,l}}$$

and this can be generalized to multidimensional matrices.
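The defining sum above can be written directly with np.einsum. Here is a small numpy sketch (function name and random data are mine) that cross-checks the einsum against the formula written out as explicit loops:

```python
import numpy as np

def ternary(A, B, C):
    """[A,B,C]_{ijl} = sum_k A_{ijk} B_{ikl} C_{kjl}, via einsum."""
    return np.einsum('ijk,ikl,kjl->ijl', A, B, C)

rng = np.random.default_rng(1)
n = 3
A, B, C = (rng.standard_normal((n, n, n)) for _ in range(3))

# Cross-check the einsum against the defining sum written out explicitly.
D = np.zeros((n, n, n))
for i in range(n):
    for j in range(n):
        for l in range(n):
            D[i, j, l] = sum(A[i, j, k] * B[i, k, l] * C[k, j, l]
                             for k in range(n))
assert np.allclose(ternary(A, B, C), D)
```

With the product in executable form, conjectured identities like para-associativity can at least be tested numerically on random inputs before attempting a proof.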

Now, what good properties does the ternary multiplication have? Well, the binary operation $$\cdot$$ is associative, so we would hope the ternary multiplication satisfies some form of associativity as well.
We call this property para-associativity; it says that

$$[[a,b,c],d,e]=[a,[b,c,d],e]=[a,b,[c,d,e]]$$

I strongly suspect that ternary multiplication satisfies this.

Gold Member
What you can do with 3D matrices is perform some kind of ternary multiplication (that is, a multiplication with three arguments).

Remember that multiplication for matrices is defined as

$$(A\cdot B)_{i,j}=\sum_{k=1}^n{A_{i,k}B_{k,j}}$$

This generalizes to 3D matrices in the following way: for 3D matrices A, B and C, we define [A,B,C] as

$$[A,B,C]_{i,j,l}=\sum_{k=1}^n{A_{i,j,k}B_{i,k,l}C_{k,j,l}}$$

and this can be generalized to multidimensional matrices.

Now, what good properties does the ternary multiplication have? Well, the binary operation $$\cdot$$ is associative, so we would hope the ternary multiplication satisfies some form of associativity as well.
We call this property para-associativity; it says that

$$[[a,b,c],d,e]=[a,[b,c,d],e]=[a,b,[c,d,e]]$$

I strongly suspect that ternary multiplication satisfies this.

You suspect, yes. But how would we prove that? I mean, we have a formula for the entry of the ternary product. Can we use that to prove para-associativity?

I'm going to go try that now.

Well, yeah, use the formula to prove it and do a lot of manipulation. I'm really bad at such things, so I won't try it. But do tell us the result!

After para-associativity, there are some other things that ternary multiplication could satisfy:
- Is there a zero element: i.e. is there a 0 such that [0,x,y]=[x,0,y]=[x,y,0]=0? This should be the zero matrix, but is it unique?
- Is there an identity: i.e. is there a matrix 1 such that [1,1,x]=[x,1,1]=x for all $$x\neq 0$$? And is this 1 unique?
- How does ternary multiplication behave w.r.t. addition and scalar multiplication? I.e. is it true that $$[\alpha x+\beta y,c,d]=\alpha[x,c,d]+\beta[y,c,d]$$?
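The last two of these questions are the easiest to settle: each summand in the defining formula contains exactly one entry from each argument, so the product is linear in every slot, and the all-zeros array is a zero element. A quick numpy check (function name and random data are mine) agrees:

```python
import numpy as np

def ternary(A, B, C):
    # [A,B,C]_{ijl} = sum_k A_{ijk} B_{ikl} C_{kjl}
    return np.einsum('ijk,ikl,kjl->ijl', A, B, C)

rng = np.random.default_rng(2)
n = 3
x, y, c, d = (rng.standard_normal((n, n, n)) for _ in range(4))
alpha, beta = 2.0, -0.5

# Linearity in the first slot; the same argument works for each slot.
lhs = ternary(alpha * x + beta * y, c, d)
rhs = alpha * ternary(x, c, d) + beta * ternary(y, c, d)
assert np.allclose(lhs, rhs)

# The all-zeros array is a zero element: every summand vanishes.
zero = np.zeros((n, n, n))
assert np.allclose(ternary(zero, c, d), zero)
assert np.allclose(ternary(c, zero, d), zero)
assert np.allclose(ternary(c, d, zero), zero)
```

The zero and identity questions about uniqueness, and para-associativity itself, still need an actual proof or counterexample; random numerical tests like this can only rule identities out, not establish them.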

These are some exciting questions you could ask for this multiplication.