Polynomial equations whose coefficients are all matrices (was "I shouldnot")

yeeyee

<< post edited by berkeman to improve clarity >>

That aside,
I am looking for a free downloadable linear algebra book that will teach me about finding the roots of polynomial equations whose coefficients are all matrices.

Thank you very much
 


Welcome to PF yeeyee.

The sentence also doesn't make much sense to me. What is a polynomial equation whose coefficients are matrices?
 


Well CompuChip, I think that in the same way we have scalar coefficients, we may have matrix coefficients in a polynomial.

Formally it sounds OK.
 


Then why don't you give me an answer?
 


Well, think of it this way: you need to find the roots of each standalone scalar polynomial, and then find the ones common to all of them.

for example:

Ax^2 + Bx + C = 0
where A, B, C are constant matrix coefficients.

so you need to solve:
a_{ij}x^2 + b_{ij}x + c_{ij} = 0
and then, after finding the roots of each equation, you reduce the list of roots to those common to every equation; those would be the roots of the original polynomial.
But I guess this would work only if A, B, C have the same number of rows and columns.
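
For what it's worth, here is a minimal Python/NumPy sketch of that entrywise idea, assuming square 2x2 real matrices; the particular A, B, C below are made up purely for illustration:

```python
# A minimal sketch of the entrywise approach above, assuming square 2x2 real
# matrices; these particular A, B, C are made up purely for illustration.
import numpy as np

A = np.array([[1.0, 2.0], [1.0, 1.0]])
B = np.array([[-3.0, -3.0], [-2.0, -4.0]])
C = np.array([[2.0, 1.0], [1.0, 3.0]])

common_roots = None
for i in range(A.shape[0]):
    for j in range(A.shape[1]):
        # roots of the scalar quadratic a_ij x^2 + b_ij x + c_ij = 0
        # (rounded so that floating-point duplicates compare equal)
        roots = set(np.round(np.roots([A[i, j], B[i, j], C[i, j]]), 8))
        common_roots = roots if common_roots is None else common_roots & roots

print(common_roots)  # roots common to every entrywise equation; here only x = 1
```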
 


OK, so Wikipedia says:
The roots of a polynomial matrix over the complex numbers are the points in the complex plane where the matrix loses rank.
I suppose that means that the roots are those x for which the rank is non-maximal, which is, IIRC, equivalent to the matrix being non-invertible. So I guess one could compute the determinant and solve the remaining higher-order polynomial.
So in lqg's notation, let M be given by its entries
m_{ij} = a_{ij} x^2 + b_{ij} x + c_{ij}
and solve det(M) = 0 for x.
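
Something like this rough SymPy sketch, say, reusing the same made-up A, B, C as in the sketch above (again only a guess at what is meant, and again assuming square matrices):

```python
# A rough sketch of the det(M) = 0 recipe, reusing the same made-up A, B, C;
# SymPy is used here only for convenience (square matrices assumed).
import sympy as sp

x = sp.symbols('x')
A = sp.Matrix([[1, 2], [1, 1]])
B = sp.Matrix([[-3, -3], [-2, -4]])
C = sp.Matrix([[2, 1], [1, 3]])

M = A * x**2 + B * x + C      # entries m_ij = a_ij*x**2 + b_ij*x + c_ij
p = sp.expand(M.det())        # a single scalar polynomial in x (degree 4 here)
print(sp.solve(p, x))         # the x at which M(x) loses (full) rank
```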
 


Assuming we do mean finding real roots (as opposed to matrix-valued roots), then I think CompuChip's idea turns a very simple problem into a very hard one, with extra assumptions placed on the terms.

1) Who says that the matrices are square?
2) Why turn mxn simpler equations in 1 unknown into one very high degree equation? Just solve each in turn, and see if there is some number in all of the solution sets as in lqg's post? If you can't solve the simpler equations you certainly can't solve the one of much higher degree.
3) Just because a determinant is zero does not mean that the matrix is identically zero, which is what is required.

Are we sure that we're looking for roots over R, and not some other ring, like the matrix ring itself?
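
To put a number on point (3): with the same made-up A, B, C as in the sketches above, det M(x) vanishes at three values of x, but M(x) is the zero matrix at only one of them.

```python
# A small numeric check of point (3), with the same made-up A, B, C as above:
# det M(x) = 0 has roots at which M(x) is not the zero matrix.
import sympy as sp

x = sp.symbols('x')
A = sp.Matrix([[1, 2], [1, 1]])
B = sp.Matrix([[-3, -3], [-2, -4]])
C = sp.Matrix([[2, 1], [1, 3]])
M = A * x**2 + B * x + C

for r in sp.solve(M.det(), x):
    # True only at x = 1, the root common to every entry of M
    print(r, sp.simplify(M.subs(x, r)) == sp.zeros(2, 2))
```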
 


loop quantum gravity said:
But I guess this would work only if A, B, C have the same number of rows and columns.

If they don't then it doesn't make any sense to add up linear combinations of them.
 


n_bourbaki said:
If they don't then it doesn't make any sense to add up linear combinations of them.
Yes, of course.
 


n_bourbaki said:
1) Who says that the matrices are square?
I don't think anyone did?

n_bourbaki said:
2) Why turn mxn simpler equations in 1 unknown into one very high degree equation? Just solve each in turn, and see if there is some number in all of the solution sets as in lqg's post? If you can't solve the simpler equations you certainly can't solve the one of much higher degree.
Probably to determine whether the rank is maximal. One can indeed also solve each equation separately and check which solutions are roots of two or more equations.

n_bourbaki said:
3) Just because a determinant is zero does not mean that the matrix is identically zero, which is what is required.
I found the phrase "loses rank" a bit unclear. I thought it meant that the rank is lower than the maximal rank. But it might indeed also mean that the rank becomes zero.

n_bourbaki said:
Are we sure that we're looking for roots over R, and not some other ring, like the matrix ring itself?
The variable is x which is an element of R, not a matrix. So... yes, I guess so.

As I indicated in my first post, I've never seen such equations and all the information I have on them I found on Google. So I might be wrong here.
 


CompuChip said:
I don't think anyone did? [mention anything about square matrices]

You did when you took the determinant.

CompuChip said:
The variable is x which is an element of R, not a matrix. So... yes, I guess so.

No, no one has said that the polynomial equation is to be viewed as a function from R to matrices. The OP has posed an ambiguous question. It might be most sensible, and probably only possible, if we view this as wishing for x to be in R.
 