
General Solution for Eigenvalues for a 2x2 Symmetric Matrix

  1. May 19, 2014 #1
    1. The problem statement, all variables and given/known data

    From Mary Boas' "Mathematical Methods in the Physical Sciences" 3rd Edition Chapter 3 Sec 11 Problem 33 ( 3.11.33 ).

    Find the eigenvalues and the eigenvectors of the real symmetric matrix.

    $$M=\begin{pmatrix} A & H \\ H & B \end{pmatrix}$$

    Show the eigenvalues are real and the eigenvectors are perpendicular.

    2. Relevant equations

    $$D={ C }^{ -1 }MC$$


    3. The attempt at a solution

    The second part of the problem was easily proven using a variation of the proof for hermitian matrices.

    The first part produces horrible algebraic messes with both of the approaches I have tried. For example, click the link:

    https://www.wolframalpha.com/input/?i=determinant+{{a-x,H},{H,b-x}}=0

    Is there an elegant way to find a general solution for the 2x2 symmetric matrix? No spoilers, but hints appreciated.

    Thanks,
    Chris Maness
     
    Last edited: May 19, 2014
  3. May 19, 2014 #2

    jbunniii

    Science Advisor
    Homework Helper
    Gold Member

    We can answer this part of the question without having to calculate the eigenvalues at all:
    Indeed, this is true for any real symmetric matrix (or more generally, any hermitian matrix), not just 2x2. Let ##M## be any hermitian matrix, so ##M^* = M##. If ##\lambda## is an eigenvalue and ##x## is a corresponding eigenvector, then by definition ##Mx = \lambda x##. Therefore ##x^* M x = \lambda(x^* x)##. Since ##x^* M x## and ##x^* x## are both real numbers (why?), this means ##\lambda## is also real.

    This part is not necessarily true without further assumptions:
    For example, if ##A=B=1## and ##H = 0## then ##M## is the identity matrix. Every nonzero vector is an eigenvector of the identity matrix, with eigenvalue equal to 1. But it is possible to choose two orthogonal eigenvectors.

    Finally, as for explicitly calculating the eigenvalues in the 2x2 case, did you try simply calculating the determinant of ##M - \lambda I## and setting it equal to zero?
    $$\det\left(\begin{matrix}
    A - \lambda & H \\
    H & B - \lambda \\
    \end{matrix}\right) = 0$$
    This should give you a quadratic equation, and you know how to solve those...
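As a concrete check of this hint, here is a short sympy sketch (not from the thread; the symbol names just mirror the matrix above) that expands the determinant and solves the resulting quadratic:

```python
# Symbolically expand det(M - x*I) for the 2x2 symmetric matrix and solve
# the quadratic for x (a sketch using sympy; symbol names mirror the post).
import sympy as sp

A, B, H, x = sp.symbols('A B H x', real=True)
M = sp.Matrix([[A, H], [H, B]])
char_poly = sp.expand((M - x * sp.eye(2)).det())  # x**2 - (A+B)*x + A*B - H**2
roots = sp.solve(sp.Eq(char_poly, 0), x)          # the two eigenvalues
print(char_poly)
print(roots)
```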
     
  4. May 19, 2014 #3

    Ray Vickson

    Science Advisor
    Homework Helper

    Show your work; for all we know, maybe what you have already is about the best way to do the question---or maybe not. How can we tell? Your wolfram link gets the eigenvalues OK, but what about the eigenvectors?
     
    Last edited: May 19, 2014
  5. May 19, 2014 #4

    BruceW

    Homework Helper

    yeah, maybe there is no nice way to do it. you could use your relevant equation. But I don't think that is super-simple to do by hand either.
     
  6. May 19, 2014 #5
    Ok, that is all I wanted to know. I know it could be done the long, tedious way, but I get the concept, so I don't see too much point in dealing with a huge algebraic mess. I am self studying, so I am the one picking which questions I do. I can almost see a pattern with the simple symmetric matrices, but none of my little rules worked for all of them. I worked at that for some time last night.

    Chris Maness
     
  7. May 19, 2014 #6

    D H

    Staff Emeritus
    Science Advisor

    The only possible problem would be with the radical ##\sqrt{a^2-2ab+b^2+4H^2}##, and that is obviously real (why?).


    jbunniii's hint is much more elegant. It works on any NxN real symmetric matrix. Dealing with determinants gets very messy with N=3, and downright ugly for any larger N.
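As a numerical spot check of both points (a numpy sketch; the sample values a, b, h are arbitrary choices, not from the problem):

```python
# Compare the closed-form roots of the quadratic with a numerical eigensolver.
# The radicand (a-b)^2 + 4h^2 is a sum of squares, hence never negative,
# which is why the square root -- and the eigenvalues -- are real.
import numpy as np

a, b, h = 2.0, 5.0, 3.0                     # arbitrary sample values
M = np.array([[a, h], [h, b]])

disc = np.sqrt((a - b)**2 + 4 * h**2)       # = sqrt(a^2 - 2ab + b^2 + 4h^2)
closed_form = np.sort([(a + b - disc) / 2, (a + b + disc) / 2])
numeric = np.linalg.eigvalsh(M)             # sorted, guaranteed-real eigenvalues
print(closed_form, numeric)
```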
     
  8. May 19, 2014 #7
    I thought that is what I did in the wolfram link -- that is, set the determinant of the aforementioned matrix equal to zero. Yes, I worked two 5x5 matrices before, and I swore them off. That is why God made computers :D

    Thanks,
    Chris Maness
     
  9. May 19, 2014 #8

    D H

    Staff Emeritus
    Science Advisor

    You didn't. Look at your radicand, ##a^2-2ab+b^2+4H^2##. The ##4H^2## term is obviously positive. What can you say about ##a^2-2ab+b^2##?

    Hint: The sum of two non-negative numbers is ... ?
     
  10. May 19, 2014 #9
    Yes, real and positive. I got that earlier (stated above), but here it is for fun:

    Start with the same eigenvalue equation written down twice:

    $$\hat { H } { r }_{ 1 }={ \lambda }_{ 1 }{ r }_{ 1 }$$

    For the first copy, take the ##\dagger## of both sides. This yields:

    $${ (\hat { H } { r }_{ 1 }) }^{\dagger}={ ({ \lambda }_{ 1 }{ r }_{ 1 }) }^{\dagger}$$

    which, since ##\hat { H }## is hermitian (##{ \hat { H } }^{ \dagger }=\hat { H }##), is identical to:

    $${ { r }_{ 1 } }^{ \dagger }\hat { H } ={ \lambda }_{ 1 }^{ \star }{ { r }_{ 1 } }^{ \dagger }$$

    Multiply both sides of this equation on the right by regular ol' ##{ r }_{ 1 }##, and multiply both sides of the non-dagger equation on the left by ##{ { r }_{ 1 } }^{ \dagger }##. Here ##{ { r }_{ 1 } }^{ \dagger }{ r }_{ 1 }## is the inner product, a nonzero real/positive number. Subtract the two equations; the left sides cancel, so you have:

    $$0=\left( { \lambda }_{ 1 }-{ \lambda }_{ 1 }^{ \star } \right) { { r }_{ 1 } }^{ \dagger }{ r }_{ 1 }$$

    Since the inner product is not zero, the lambdas have to be equal. Since one is starred, ##{ \lambda }_{ 1 }## has to be wholly real.
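The same argument covers complex hermitian matrices as well, which is easy to spot-check numerically (a numpy sketch; the hermitian matrix here is made up for illustration):

```python
# Eigenvalues of a hermitian matrix should come out (numerically) real, even
# when the matrix itself is complex.  The example matrix is made up here.
import numpy as np

H = np.array([[2.0, 1 - 2j],
              [1 + 2j, -1.0]])
assert np.allclose(H, H.conj().T)      # hermitian: H^dagger = H

evals = np.linalg.eigvals(H)           # general solver, complex dtype output
print(evals)
```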

    Thanks,
    Chris Maness
     
  11. May 19, 2014 #10

    jbunniii

    Science Advisor
    Homework Helper
    Gold Member

    Regarding the orthogonality of the different eigenvectors, as I noted above it's not automatically true: the identity matrix is a counterexample where every nonzero vector is an eigenvector.

    But what is true is that for a real symmetric (or more generally, hermitian) matrix, eigenvectors corresponding to distinct eigenvalues are orthogonal. To see this, suppose that ##\lambda## and ##\mu## are eigenvalues, with corresponding eigenvectors ##x## and ##y##. Assume further that ##\lambda \neq \mu##. Then we have, by definition,
    $$Mx = \lambda x \text{ and } My = \mu y$$
    Now premultiply the first equation by ##y^*## and the second equation by ##x^*## to obtain
    $$y^*Mx = \lambda y^*x \text{ and } x^*My = \mu x^* y$$
    Now conjugate the first equation and compare with the second, using the fact that the eigenvalues are real.
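Carrying that comparison through gives ##(\lambda - \mu) x^* y = 0##, hence ##x^* y = 0##. A quick numeric illustration (a numpy sketch; the sample matrix is an arbitrary choice):

```python
# For a real symmetric matrix, eigenvectors belonging to distinct eigenvalues
# should be orthogonal.  The sample matrix is an arbitrary choice.
import numpy as np

M = np.array([[4.0, 1.0],
              [1.0, 2.0]])
evals, evecs = np.linalg.eigh(M)       # columns of evecs are the eigenvectors
x, y = evecs[:, 0], evecs[:, 1]
print(evals, x @ y)                    # distinct eigenvalues, dot product ~ 0
```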
     
  12. May 19, 2014 #11

    AlephZero

    Science Advisor
    Homework Helper

    That is true, but the only 2x2 symmetric (or hermitian) matrices with two equal eigenvalues are multiples of the identity matrix, which is a rather trivial special case.

    Of course bigger hermitian matrices can have equal eigenvalues and have non-zero off-diagonal terms as well.

    When a hermitian matrix has repeated eigenvalues, you can always find a complete set of orthogonal eigenvectors (which is a very useful property of the matrix), even though the eigenvectors corresponding to the repeated values are not unique.
     
  13. May 19, 2014 #12

    jbunniii

    Science Advisor
    Homework Helper
    Gold Member

    True, but this special case was not excluded in the problem statement, as far as I can tell.
    Yes, I assume the intent of the question is that it's always possible to make the eigenvectors orthogonal. This is clearly true in the 2x2 case, since as you said, the only situation where it's not automatic is for multiples of the identity. But the general case is more involved - essentially the spectral theorem.
     
  14. May 19, 2014 #13

    BruceW

    Homework Helper

    even if we just keep to 2x2 case, and if we exclude M from being a multiple of the identity matrix, there is still no 'nice' way to find the eigenvectors or eigenvalues, right? I think this was kq6up's main question. I would also be interested to know if there is a nice way to do it, since I can't think of any, and some elegant method would be super-useful :)

    edit: I guess it's not too much work to just compute the determinant to find eigenvalues, then plug in to find eigenvectors. But it's annoying, to do all this by hand, for a problem that looks like it could have a nice method to solve it.
     
  15. May 19, 2014 #14

    AlephZero

    Science Advisor
    Homework Helper

    I guess it depends what you call "nice". Personally I wouldn't call the quadratic equation you have to solve a "horrible algebraic mess." The form of the equation ##x^2 - (a+b)x + ab - h^2 = 0## gives you some insight into what is going on. For example, the sum of the roots is ##a+b##, independent of ##h##, and that is an example of a general result (the trace equals the sum of the eigenvalues) for any size of matrix. An eigenvalue is zero only if ##ab - h^2 = 0##, which is what you would expect, since ##ab - h^2## is the determinant of the matrix.

    I think it's optimistic to expect any "nice" method here. If you know all the eigenvalues and eigenvectors of a matrix, then any calculation involving the matrix is simple, because you can easily diagonalize it. So you shouldn't expect a "free lunch" from finding a simple way to solve the eigenproblem! Any general method of finding all the eigenvalues and eigenvectors of an ##n\times n## matrix requires on the order of ##n^3## arithmetic operations.
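Both coefficient identities (trace = sum of eigenvalues, determinant = product) are easy to spot-check (a numpy sketch; sample values are mine):

```python
# Check that the eigenvalues of [[a, h], [h, b]] satisfy
#   sum  = a + b        (trace)
#   prod = a*b - h^2    (determinant)
import numpy as np

a, b, h = 1.0, 4.0, 2.0                # arbitrary sample values
evals = np.linalg.eigvalsh(np.array([[a, h], [h, b]]))
print(evals.sum(), a + b)
print(evals.prod(), a * b - h**2)
```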
     
    Last edited: May 19, 2014
  16. May 19, 2014 #15

    jbunniii

    Science Advisor
    Homework Helper
    Gold Member

    Probably the only truly "nice" case is 1x1. :biggrin:

    Things become rapidly more complicated for 3x3 and 4x4, since the formulas for solving cubic and quartic equations are nasty. For 5x5 and above, there is no general formula at all, and it becomes a rather hard problem in numerical analysis. Here is a dense 680 page classic treatise on the subject, and it's old enough that it is nowhere near the current state of the art: The Algebraic Eigenvalue Problem by Wilkinson. The symmetric case gets its own treatment at only 416 pages: The Symmetric Eigenvalue Problem by Parlett.
     
  17. May 19, 2014 #16

    AlephZero

    Science Advisor
    Homework Helper

    You don't have to read the whole of Wilkinson or Parlett to get started. This is a perfectly good method for "small" symmetric matrices (i.e. less than about 100 x 100): http://en.wikipedia.org/wiki/Jacobi_eigenvalue_algorithm

    Unlike some methods that are faster, it guarantees the calculated eigenvectors are orthogonal, even when there are closely spaced eigenvalues. That property means it is still used as part of more complicated algorithms.

    If you work through the algebra, it gives an alternative method of solving the 2x2 case (no iterations are required for a 2x2 matrix). The solution again involves a square root, which is no surprise, because the right answer is independent of how you do the algebra.
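Here is what that single 2x2 Jacobi step looks like in code (a sketch; the rotation-angle formula is the standard one from the linked article, and the sample values are mine):

```python
# A single Jacobi (Givens) rotation diagonalizes a 2x2 symmetric matrix:
# choose theta so that the off-diagonal entry of G^T M G vanishes.
import numpy as np

a, b, h = 3.0, 1.0, 2.0                     # arbitrary sample values
M = np.array([[a, h], [h, b]])

theta = 0.5 * np.arctan2(2 * h, a - b)      # zeroes the off-diagonal entry
c, s = np.cos(theta), np.sin(theta)
G = np.array([[c, -s], [s, c]])             # rotation; columns are eigenvectors

D = G.T @ M @ G
print(D)                                    # diagonal = eigenvalues, off-diag ~ 0
```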
     
  18. May 20, 2014 #17

    BruceW

    Homework Helper

    it's not possible to diagonalize every square matrix, though. But yeah, it is pretty optimistic to expect a nice solution for symmetric matrices. I was hoping that the 2x2 matrix would be a special case with a nice way to get the answer. But I guess not. I'm always looking for that free lunch, anyway :)
     
  19. May 20, 2014 #18
    Yes, I have noticed when working problems in Boas that if I get even a relatively ugly answer, I am usually missing the point. She gives many problems with clean/elegant solutions. I don't have a solution manual for the third edition (if one even exists, since she died shortly after its publication date). Chapter 3 ends earlier in the 2nd edition, but it mostly matches the problems in the third edition; she just put more sections at the end of chapter 3.

    Regards,
    Chris Maness
     