# Polynomial equations whose coefficients are all matrices (was "I shouldnot")

1. Jul 20, 2008

### yeeyee

polynomial equations whose coefficients are all matrices (was "I shouldnot")

<< post edited by berkeman to improve clarity >>

That aside,
I am looking for a free downloadable linear algebra book that will teach me about finding roots in polynomial equations whose coefficients are all matrices.

Thank you very much

Last edited by a moderator: Jul 21, 2008
2. Jul 20, 2008

### CompuChip

Re: I shouldnot

Welcome to PF, yeeyee.

The sentence also doesn't make much sense to me. What is a polynomial equation whose coefficients are matrices?

Last edited by a moderator: Jul 21, 2008
3. Jul 20, 2008

### MathematicalPhysicist

Re: I shouldnot

Well, CompuChip, I think that the same way we have scalar coefficients, we may have matrix coefficients in a polynomial.

Formally it sounds OK.

4. Jul 20, 2008

### yeeyee

Re: I shouldnot

Then why don't you give me an answer?

5. Jul 20, 2008

### MathematicalPhysicist

Re: I shouldnot

Well, think of it this way: you need to find the roots of each standalone entry polynomial and then keep the ones common to all of them.

For example:

Ax^2 + Bx + C = 0,
where A, B, C are constant matrix coefficients.

So for each pair of indices (i, j) you need to solve
$$a_{ij}x^2+b_{ij}x+c_{ij}=0$$
and then, after finding the roots of each equation, reduce the list of roots to those common to every equation; those would be the roots of the original polynomial.
But I guess this works only if A, B, C all have the same number of rows and columns.
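A minimal sketch of this entrywise approach in Python (the matrices here are my own made-up example, constructed so that x = 2 is a root of every entry polynomial):

```python
import numpy as np

# Hypothetical 2x2 coefficients, chosen so every entry quadratic
# a_ij x^2 + b_ij x + c_ij shares the root x = 2.
A = np.array([[1.0, 1.0], [1.0, 2.0]])
B = np.array([[-3.0, -5.0], [-1.0, -5.0]])
C = np.array([[2.0, 6.0], [-2.0, 2.0]])

common = None
for i in range(2):
    for j in range(2):
        # real roots of the (i, j) entry's quadratic, rounded to tame
        # floating-point noise from np.roots
        r = {round(z.real, 9) for z in np.roots([A[i, j], B[i, j], C[i, j]])
             if abs(z.imag) < 1e-9}
        common = r if common is None else common & r

print(common)  # the x at which A x^2 + B x + C is the zero matrix
```

The intersection here is {2.0}: every entry vanishes at x = 2, so the whole matrix polynomial does.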

6. Jul 20, 2008

### CompuChip

Re: I shouldnot

OK, so Wikipedia says that such a polynomial matrix "loses rank" at its roots.
I suppose that means that the roots are those x for which the rank is non-maximal, which, IIRC, is equivalent (for a square matrix) to the matrix being non-invertible. So I guess one could compute the determinant and solve the resulting higher-order polynomial equation.
So in lqg's notation, let M be given by its entries
$$m_{ij} = a_{ij} x^2 + b_{ij} x + c_{ij}$$
and solve det(M) = 0 for x.
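A sketch of this determinant route for a hypothetical 2x2 case (the coefficient polynomials are my own choices for illustration; NumPy's polynomial helpers stand in for symbolic algebra):

```python
import numpy as np

# Each entry m_ij(x) = a_ij x^2 + b_ij x + c_ij is stored as a
# coefficient list [a_ij, b_ij, c_ij], highest degree first.
m11 = [1, -3, 2]   # x^2 - 3x + 2 = (x - 1)(x - 2)
m12 = [0, 1, -1]   # x - 1
m21 = [0, 0, 0]    # 0
m22 = [1, -5, 6]   # x^2 - 5x + 6 = (x - 2)(x - 3)

# det M(x) = m11*m22 - m12*m21, computed with polynomial arithmetic
det_poly = np.polysub(np.polymul(m11, m22), np.polymul(m12, m21))

# The x for which M(x) is singular are the roots of det M(x) = 0
roots = np.sort(np.roots(det_poly).real)
print(roots)
```

The roots come out near 1, 2, 2, 3. Note that det M(x) = 0 only says M(x) is singular there, not that M(x) is the zero matrix, which is the caveat n_bourbaki raises below.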

7. Jul 20, 2008

### n_bourbaki

Re: I shouldnot

Assuming we do mean finding real roots (as opposed to matrix-valued roots), then I think CompuChip's idea turns a very simple problem into a very hard one, with extra assumptions placed on the terms.

1) Who says that the matrices are square?
2) Why turn m×n simpler equations in one unknown into a single very-high-degree equation? Just solve each in turn and see whether some number lies in all of the solution sets, as in lqg's post. If you can't solve the simpler equations, you certainly can't solve one of much higher degree.
3) Just because the determinant is zero does not mean that the matrix is identically zero, which is what is required.

Are we sure that we're looking for roots over R, and not some other ring, like the matrix ring itself?
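(To illustrate why roots over the matrix ring are a genuinely different question, here is a standard example: over the ring of 2×2 real matrices, even the equation $$X^2 = I$$ has infinitely many roots, e.g. $$X = \begin{pmatrix} 1 & 0 \\ c & -1 \end{pmatrix}$$ for every real c.)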

8. Jul 20, 2008

### n_bourbaki

Re: I shouldnot

If they don't (all have the same dimensions), then it doesn't make any sense to add linear combinations of them.

9. Jul 20, 2008

### MathematicalPhysicist

Re: I shouldnot

Yes, of course.

10. Jul 20, 2008

### CompuChip

Re: I shouldnot

I don't think anyone did?

To determine whether the rank is maximal, one can probably indeed also solve each entry equation separately and check which solutions are roots of two or more of the equations.

I found the phrase "loses rank" a bit unclear. I thought it meant that the rank is lower than the maximal rank. But it might indeed also mean that the rank becomes zero.

The variable is x, which is an element of R, not a matrix. So... yes, I guess so.

As I indicated in my first post, I've never seen such equations and all the information I have on them I found on Google. So I might be wrong here.

11. Jul 20, 2008

### n_bourbaki

Re: I shouldnot

You did when you took the determinant.

No, no one has said that the polynomial equation is to be viewed as a function from R to matrices. The OP has posed an ambiguous question. It is probably most sensible, and perhaps only tractable, if we take x to be in R.