
Matrices; eigenvalue help

  1. Aug 11, 2011 #1
    1. The problem statement, all variables and given/known data

    2. Relevant equations

    3. The attempt at a solution

    The usual method, i.e. det(A - bI) = 0.

    I get the characteristic equation as b^3 - 75b^2 + 1850b - 15576 = 0

    and from this I get b_1^2 + b_2^2 + b_3^2 = 1925 < 1949.

    Is there an easier/more creative method?
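    A quick Python sketch of the shortcut that avoids solving the cubic at all: by Vieta, the b^2 and b coefficients of a monic cubic give the sum and the pairwise products of the roots, and Newton's identity turns those into the sum of squares.

```python
# Vieta for a monic cubic b^3 - e1*b^2 + e2*b - e3 = 0:
#   e1 = sum of roots, e2 = sum of pairwise products of roots.
# Newton's identity: sum of squared roots = e1^2 - 2*e2,
# so the roots themselves never need to be computed.
e1, e2 = 75, 1850            # read off the characteristic equation above

sum_of_squares = e1**2 - 2 * e2
print(sum_of_squares)        # 1925
```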


    Attached Files: 2a.jpg (32.5 KB)
    Last edited: Aug 11, 2011
  3. Aug 11, 2011 #2


    Science Advisor

  4. Aug 11, 2011 #3
    Another way to look at this: Are you familiar with the fact that every complex square matrix is unitarily equivalent to an upper triangular matrix and the properties that are invariant under such an equivalence?
  5. Aug 12, 2011 #4
    No idea... I've never heard of it before! :smile:
  6. Aug 12, 2011 #5
    Hmmm, perhaps HallsofIvy has something else in mind that is simpler than my plan of attack. I'm not sure how to solve this problem by just looking at the spectral radius though, so I can't help with that.

    I'll give you a quick rundown on what I was getting at, and I'll provide a few references...

    Are you familiar with the notion of "similar matrices?" If not, the general idea is that two square matrices A,B are similar when there exists an invertible P such that

    [tex] A = P^{-1}BP.[/tex]

    It can be shown that similar matrices share a lot of properties and we say that these properties are invariant under similarity. This definition may look strange, but if you study the matrix representations of abstract linear transformations with respect to different bases, it turns out that those matrices will be related via this "similarity" equation [itex] A=P^{-1}BP.[/itex] The matrix P is called the "change-of-basis" matrix.
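    A quick check of this invariance in Python/NumPy, with a small made-up B and an arbitrary invertible P (any invertible P works):

```python
import numpy as np

# Similar matrices A = P^-1 B P share trace, determinant and eigenvalues.
# B and P here are made up purely for illustration.
B = np.array([[4.0, 1.0], [2.0, 3.0]])
P = np.array([[1.0, 1.0], [0.0, 1.0]])   # invertible (det = 1)
A = np.linalg.inv(P) @ B @ P

print(np.trace(A), np.trace(B))               # equal (up to floating point)
print(np.linalg.det(A), np.linalg.det(B))     # equal (up to floating point)
print(np.sort(np.linalg.eigvals(A)))          # same spectrum as B
print(np.sort(np.linalg.eigvals(B)))
```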

    Now, two matrices A,B are unitarily equivalent when there exists a unitary matrix U such that

    [tex] A = U^* BU.[/tex]

    A unitary matrix [itex]U[/itex] is a matrix whose conjugate transpose [itex]U^*[/itex] is equal to its inverse. So, then, by definition unitarily equivalent matrices ARE similar, but the converse is in general false. So since unitary similarity is a "stronger" condition on matrices in a certain sense, we expect that there will be additional properties that are invariant under unitary similarity in addition to the ones that are invariant under "standard" similarity.

    Here's an example. The trace of a matrix is invariant under similarity. But under unitary similarity, not only is the trace invariant: (1) the sum of the squares of the absolute values of ALL the matrix entries is invariant too (in other words, the Frobenius norm, http://en.wikipedia.org/wiki/Frobenius_norm, is invariant). Therefore, we use the fact that ANY matrix is unitarily equivalent to an upper triangular matrix (and where are the eigenvalues of an upper triangular matrix? :smile:) together with property (1) to find the bound [itex]\sqrt{1949}.[/itex]
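    A sketch of property (1) in Python/NumPy, using a random matrix and a unitary (here real orthogonal) U built from a QR factorisation:

```python
import numpy as np

# The Frobenius norm is invariant under unitary equivalence.
# Q from a QR factorisation of a random matrix is orthogonal,
# so for real matrices U* = U.T and B = U.T A U is unitarily
# equivalent to A.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
U, _ = np.linalg.qr(rng.standard_normal((3, 3)))

B = U.T @ A @ U
print(np.linalg.norm(A, 'fro'), np.linalg.norm(B, 'fro'))  # equal
```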

    If you want to learn why every matrix is unitarily equivalent to an upper triangular matrix (Schur's theorem), you can learn it from where I learned it, http://books.google.com/books?id=Pl...CDYQ6AEwBA#v=onepage&q=theorem schur&f=false, on page 79. A proof of property (1) is back on page 73. :biggrin:
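    Schur's theorem is also available numerically, e.g. via scipy.linalg.schur. A sketch with a made-up 3x3 matrix (NOT the matrix from the attached 2a.jpg): A = Z T Z* with Z unitary and T upper triangular, so the eigenvalues of A appear on the diagonal of T.

```python
import numpy as np
from scipy.linalg import schur

# Made-up, diagonally dominant matrix purely for illustration.
A = np.array([[21.0, 1.0, 0.5],
              [0.8, 26.0, 1.2],
              [0.3, 0.9, 28.0]])

T, Z = schur(A, output='complex')   # A = Z @ T @ Z.conj().T
print(np.diag(T))                   # eigenvalues of A on the diagonal of T
print(np.sort(np.linalg.eigvals(A)))
```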
    Last edited by a moderator: Apr 26, 2017
  7. Aug 14, 2011 #6

    I like Serena

    Homework Helper

    Nice problem! :smile:

    I've been puzzling on my own what you can say about this matrix.

    Note that the values on the main diagonal are large, while the others are small.
    This means the eigenvalues will be close to the values on the main diagonal.
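    "Eigenvalues close to the diagonal" can be made precise with Gershgorin's circle theorem: every eigenvalue lies in some disc centred at a diagonal entry a_ii with radius equal to the sum of the absolute off-diagonal entries of row i. A Python sketch with a made-up matrix of the same flavour (the attached matrix itself is not reproduced here):

```python
import numpy as np

# Made-up matrix with large diagonal, small off-diagonal entries.
A = np.array([[21.0, 1.0, 0.5],
              [0.8, 26.0, 1.2],
              [0.3, 0.9, 28.0]])

# Gershgorin radius of row i = sum of |off-diagonal entries| in row i.
radii = np.sum(np.abs(A), axis=1) - np.abs(np.diag(A))
for lam in np.linalg.eigvals(A):
    # every eigenvalue lies in at least one Gershgorin disc
    assert any(abs(lam - A[i, i]) <= radii[i] for i in range(3))
print(np.diag(A), np.linalg.eigvals(A))   # eigenvalues near 21, 26, 28
```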

    Furthermore, the matrix is *almost* symmetric.
    A symmetric matrix is unitarily equivalent to a diagonal matrix that has exactly the eigenvalues on its main diagonal and zeroes elsewhere (whereas an upper triangular matrix has its eigenvalues on the main diagonal).

    If the matrix were symmetric, its Frobenius norm would be exactly the square root of the sum of the squared eigenvalues.
    As it is, the Frobenius norm is just a little higher.

    The determinant of the matrix is 15000, which is also the product of the eigenvalues.
    Any eigenvalue that is a rational number has to be a whole number that divides 15000 (due to the rational root theorem).
    Since we already know the eigenvalues have to be around 21, 26, and 28, the closest divisors to check are 20, 25, and 30.

    Furthermore the sum of the eigenvalues has to be the trace of the matrix, which is 75.
    Lo and behold: 20+25+30=75. We have a match! :cool:

    So we can quickly find the eigenvalues and calculate the exact value for the root requested.
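    The divisor hunt above can be sketched in Python; the 15-35 search window is an assumption based on the diagonal estimates, and the sketch assumes three distinct eigenvalues:

```python
from itertools import combinations

# Rational roots of the monic integer cubic must be integer divisors
# of the determinant 15000; filter those near the diagonal entries and
# impose the trace (sum = 75) and determinant (product = 15000) constraints.
divs = [d for d in range(1, 15001) if 15000 % d == 0]
cands = [d for d in divs if 15 <= d <= 35]     # assumed window near 21-28

for trio in combinations(cands, 3):
    if sum(trio) == 75 and trio[0] * trio[1] * trio[2] == 15000:
        print(trio)   # (20, 25, 30)
```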
  8. Aug 14, 2011 #7


    Homework Helper

    You have the equation for the eigenvalues: b^3 - 75b^2 + 1850b - 15576 = 0. If there are three real roots λ1, λ2, λ3, the equation can be written in the form (b - λ1)(b - λ2)(b - λ3) = 0. Do the multiplications and compare the coefficients of each power of b with the ones in the original equation.
    You get λ1^2 + λ2^2 + λ3^2 without calculating the λ-s.
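    The coefficient comparison can be carried out symbolically, e.g. with SymPy:

```python
import sympy as sp

# Expand (b - l1)(b - l2)(b - l3) and read off the elementary symmetric
# polynomials of the roots, then form the sum of squares without ever
# solving for the roots.
b, l1, l2, l3 = sp.symbols('b l1 l2 l3')
print(sp.expand((b - l1) * (b - l2) * (b - l3)))
# coefficient of b^2 is -(l1+l2+l3), coefficient of b is l1*l2+l1*l3+l2*l3

# Matching coefficients gives l1+l2+l3 = 75 and l1*l2+l1*l3+l2*l3 = 1850, so:
identity = sp.expand((l1 + l2 + l3)**2 - 2 * (l1*l2 + l1*l3 + l2*l3))
print(identity)          # l1**2 + l2**2 + l3**2
print(75**2 - 2 * 1850)  # 1925
```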

  9. Aug 14, 2011 #8

    I like Serena

    Homework Helper

    One small correction, the equation should be: b^3 - 75b^2 + 1850b - 15000 = 0. :wink:
  10. Aug 14, 2011 #9

    I like Serena

    Homework Helper

    Nice! :smile:
    I just got what you were hinting at.
    It doesn't even matter that the last coefficient is wrong!
  11. Aug 14, 2011 #10


    Homework Helper

    You were very fast!
    I did not check the equation :shy: Luckily, the last term does not matter.
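    A quick numeric confirmation of why the last term does not matter: the sum of squared roots depends only on the b^2 and b coefficients, so both constants give the same answer.

```python
import numpy as np

# Sum of squared roots = 75^2 - 2*1850 = 1925 regardless of the
# constant term (Newton's identity only uses the first two
# elementary symmetric polynomials of the roots).
for c0 in (15576, 15000):    # the typo'd and the corrected constant
    roots = np.roots([1, -75, 1850, -c0])
    print(np.sum(roots**2).real)   # ~1925 both times
```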
