
Homework Help: Calculating Eigenvalues

  1. Dec 13, 2017 #1
    1. The problem statement, all variables and given/known data

    [Problem statement attached as an image: a 3×3 symmetric matrix is given, and the question asks for the sum of the squares of its eigenvalues (multiple choice).]
    2. Relevant equations


    3. The attempt at a solution
    I solved it by finding the eigenvalues from ##|A - \lambda I| = 0##.

    This gave me ##\lambda_1 = 6.42,\ \lambda_2 = 0.387,\ \lambda_3 = -0.806##.

    So the required answer is 42.02, option (b).

    Is this correct?

    The matrix is symmetric. Is there an easier way to find the answer?
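
    A numerical cross-check is easy in numpy. This is only a sketch, using a stand-in symmetric matrix ##M##, since the actual problem matrix is only in the attached image:

    import numpy as np

    # Stand-in symmetric matrix -- NOT the problem matrix (that one is in the attached image)
    M = np.array([[4.0, 1.0, 2.0],
                  [1.0, 3.0, 0.0],
                  [2.0, 0.0, 5.0]])

    vals = np.linalg.eigvalsh(M)    # eigvalsh is meant for symmetric/Hermitian matrices
    print(vals, np.sum(vals**2))    # eigenvalues and the sum of their squares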
     

  2. Dec 13, 2017 #2

    LCKurtz

    Science Advisor
    Homework Helper
    Gold Member

    Do you know the relation between the sum of the squares of the eigenvalues and the trace of a matrix?
     
  3. Dec 13, 2017 #3
    No.
     
  4. Dec 13, 2017 #4

    Ray Vickson

    Science Advisor
    Homework Helper

    Google is your friend. Try looking up "trace of a matrix".
     
  5. Dec 13, 2017 #5

    fresh_42

    2017 Award

    Staff: Mentor

    I get the same result.
     
  6. Dec 13, 2017 #6

    StoneTemplePython

    Science Advisor
    Gold Member

    The fact that it's symmetric leads to some very nice results. What result do you get if you square each entry in your matrix and then sum them all? This is called the squared Frobenius norm (one way of generalizing the L2 norm for vectors to matrices).
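
    A minimal numpy sketch of that computation (the 2x2 matrix here is an arbitrary stand-in, not the one from the problem):

    import numpy as np

    A = np.array([[1.0, 2.0],
                  [3.0, 4.0]])              # arbitrary stand-in matrix

    print(np.sum(A**2))                     # sum of squared entries: 30.0
    print(np.linalg.norm(A, 'fro')**2)      # squared Frobenius norm: 30.0
    print(np.trace(A.T @ A))                # trace formulation: 30.0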
     
  7. Dec 13, 2017 #7

    Orodruin

    Staff Emeritus
    Science Advisor
    Homework Helper
    Gold Member
    2017 Award

    I would like to add to what the previous posters said: do not round your numbers while doing your computations. Keep them in exact form to the very end and only evaluate numerically at the end, if necessary. You should find that the answer is exactly 42, not 42.02.
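
    One way to keep everything exact is to do the arithmetic symbolically, e.g. with sympy. A sketch only; the matrix below is a placeholder, not the one from the problem:

    from sympy import Matrix

    # Placeholder integer matrix -- substitute the actual problem matrix here
    A = Matrix([[4, 1, 2],
                [1, 3, 0],
                [2, 0, 5]])

    print((A**2).trace())    # exact integer; no rounding along the way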
     
  8. Dec 13, 2017 #8
    I googled it.
    I got ##\operatorname{tr}(A) = \sum_i \lambda_i## and ##\det(A) = \prod_i \lambda_i##, for ##i = 1, 2, 3##.
    Better to use ##\operatorname{tr}(A^2) = \sum_i \lambda_i^2##.

    Also, ##\operatorname{tr}(A^n) = \operatorname{tr}(D^n)##, where ##D## is any matrix similar to ##A##.
    If the matrix has all distinct eigenvalues, ##D## can be taken as the diagonal matrix of the ##\lambda_i##.
    In that case ##\operatorname{tr}(A^n) = \operatorname{tr}(D^n) = \sum_i \lambda_i^n##.

    But the first two equations will not give the result, and diagonalizing is complicated, too; it seemed easier to calculate the eigenvalues directly, as I did in the OP.
    I was looking for something like this. The sum of the squared entries is 42.

    So, is it true for any symmetric matrix?
    How to prove it?

    ##\operatorname{tr}(A^2) = \sum_i (A^2)_{ii}##, and ##(A^2)_{ii} = \sum_j A_{ij} A_{ji}##.
    For a symmetric matrix, ##A_{ij} = A_{ji}##.
    So ##(A^2)_{ii} = \sum_j A_{ij}^2## and ##\operatorname{tr}(A^2) = \sum_i \sum_j A_{ij}^2##.
    Is this correct?
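
    A quick random test of the identity (a numpy sketch with a randomly generated symmetric matrix):

    import numpy as np

    rng = np.random.default_rng(0)
    R = rng.standard_normal((4, 4))
    S = R + R.T                        # symmetric by construction

    print(np.trace(S @ S))             # tr(A^2)
    print(np.sum(S**2))                # sum over i, j of A_ij^2 -- agrees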
     
  9. Dec 13, 2017 #9

    Orodruin

    Staff Emeritus
    Science Advisor
    Homework Helper
    Gold Member
    2017 Award

    For ##n = 2## it directly gives you the result. You just need to square the matrix and sum the diagonal of the result, which is very simple. If you go via the eigenvalues, you need to solve for the roots of a third-degree polynomial.

    Essentially. However, I think it is easier to go the other way and just see that ##A_{ij}A_{ij} = \operatorname{tr}(AA^T)## (summation over the repeated indices implied). Since ##A## is symmetric, ##AA^T = A^2##.
     
  10. Dec 13, 2017 #10

    StoneTemplePython

    Science Advisor
    Gold Member

    Note: we are dealing with reals in this post. Your approach is close, and maybe even correct, but I find it hard to follow.

    My strong preference here is to block your matrix by column vectors.

    Suppose you have some matrix ##\mathbf X##, partitioned by columns below

    ##\mathbf X = \bigg[\begin{array}{c|c|c|c|c}
    \mathbf x_1 & \mathbf x_2 &\cdots & \mathbf x_{n-1} & \mathbf x_n\end{array}\bigg]##

    To make the link with the traditional L2 norm for vectors, consider the vec operator

    ##
    vec\big(\mathbf X\big) = \begin{bmatrix}
    \mathbf x_1 \\
    \mathbf x_2\\
    \vdots \\
    \mathbf x_{n-1}\\
    \mathbf x_n
    \end{bmatrix}##

    which stacks each column of the matrix ##\mathbf X## on top of each other into one big vector. (The vec operator will show up again if and when you start dealing with Kronecker products.)

    Our goal is to add up each squared component of ##\mathbf X##. Do you understand why

    ##\big \Vert \mathbf X \big \Vert_F^2 = \sum_{j=1}^n\sum_{i=1}^n x_{i,j}^2 = trace\big(\mathbf X^T \mathbf X\big) = vec\big(\mathbf X\big)^Tvec\big(\mathbf X\big)= \big \Vert vec\big(\mathbf X\big) \big \Vert_2^2##

    is true for any real matrix?

    Now since ##\mathbf X## is symmetric, we have ##\mathbf X^T = \mathbf X## meaning that

    ##\big \Vert \mathbf X \big \Vert_F^2 = trace\big(\mathbf X^T \mathbf X\big) = trace\big(\mathbf X \mathbf X\big) = trace\big(\mathbf X^2\big)##

    Now you just need the fact that others mentioned, relating the trace of a matrix to its eigenvalues (in this case: the trace of the matrix's second power gives the sum of the eigenvalues' second powers).

    Why is this fact true? (Hint: use characteristic polynomial, or if you prefer an easy but less general case: real symmetric matrices are diagonalizable -- do that and apply cyclic property of trace.)

    Trace is absurdly useful, so it's worth spending extra time understanding all the related details of this problem.
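
    The whole chain of equalities can be verified numerically in a few lines (a sketch with a random real matrix):

    import numpy as np

    rng = np.random.default_rng(1)
    X = rng.standard_normal((3, 3))

    v = X.flatten(order='F')           # vec(X): stack the columns into one long vector
    print(np.sum(X**2))                # squared Frobenius norm
    print(np.trace(X.T @ X))           # trace form
    print(v @ v)                       # vec(X)^T vec(X) -- all three agree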
     
  11. Dec 13, 2017 #11
    Then, for an anti-symmetric matrix, ##\operatorname{tr}(A^2) = \operatorname{tr}(-AA^T) = -A_{ij}A_{ij}## = negative of the sum of the elements of the matrix. Right?
     
  12. Dec 13, 2017 #12

    StoneTemplePython

    Science Advisor
    Gold Member

    Yes.

    Note that in general, over real ##n \times n## matrices,

    ##\big \vert trace\big(\mathbf A \mathbf A \big)\big \vert = \big \vert trace \Big( \big( \mathbf A^T \big)^T \mathbf A\Big)\big\vert \leq trace\big(\mathbf A^T \mathbf A\big) = \big \Vert \mathbf A\big \Vert_F^2 ##

    with equality iff ##\mathbf A## is a scalar multiple of ##\mathbf A^T##.

    You could prove this with Schur's Inequality. Alternatively (perhaps using the vec operator to help) recognize that the trace gives an inner product. Direct application of Cauchy Schwarz gives you

    ##\big \vert trace\big(\mathbf B^T \mathbf A \big) \big \vert = \big \vert vec\big( \mathbf B\big)^T vec\big( \mathbf A\big)\big \vert \leq \big \Vert vec\big( \mathbf B\big)\big \Vert_2 \big \Vert vec\big( \mathbf A\big)\big \Vert_2 =\big \Vert \mathbf B \big \Vert_F \big \Vert \mathbf A \big \Vert_F##

    with equality iff ##\mathbf B = \gamma \mathbf A##. (Also note the trivial case: if one or both matrices are entirely zero, equality holds.)

    In your real skew-symmetric case, ##\mathbf B = \mathbf A^T## and ##\gamma = -1##. And of course in the real symmetric case, ##\gamma = 1##.
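
    A sketch of the real skew-symmetric case (random matrix, numpy):

    import numpy as np

    rng = np.random.default_rng(2)
    R = rng.standard_normal((3, 3))
    K = R - R.T                        # skew-symmetric: K^T = -K

    print(np.trace(K @ K))             # tr(A^2)
    print(-np.sum(K**2))               # minus the sum of squared entries -- agrees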
     
  13. Dec 13, 2017 #13
    I forgot to write "square of the elements".
    The corrected statement:
    Then, for an anti-symmetric matrix, ##\operatorname{tr}(A^2) = \operatorname{tr}(-AA^T) = -A_{ij}A_{ij}## = negative of the sum of the squares of the elements of the matrix.
     
  14. Dec 15, 2017 #14

    epenguin

    Homework Helper
    Gold Member

    I am not very familiar with some of the algebra mentioned, though I don’t think it is very difficult.

    However, it seems possible to solve the problem without knowing all this, though I'm sure it does no harm to know it.

    You could write out the eigenvalue equation as a cubic. The sum of the roots, ##\sum \lambda_i##, is well known, as is the sum of the products of pairs of roots, ##\sum_{i<j} \lambda_i \lambda_j##. From these you could get the sum of the squares of the roots, ##\sum \lambda_i^2##.

    I have not looked into it, but from what was being said about symmetry I suspect it would be easy to solve this cubic.
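
    Spelled out, the root-sum route looks like this: if the characteristic polynomial is ##\lambda^3 - e_1 \lambda^2 + e_2 \lambda - e_3 = 0## with ##e_1 = \sum_i \lambda_i## and ##e_2 = \sum_{i<j} \lambda_i \lambda_j##, then
    $$\sum_i \lambda_i^2 = e_1^2 - 2 e_2 ,$$
    and for a matrix ##e_1 = \operatorname{tr}(A)## while ##e_2## is the sum of the three ##2 \times 2## principal minors, so no root-finding is needed.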
     
    Last edited: Dec 15, 2017
  15. Dec 15, 2017 #15

    Ray Vickson

    Science Advisor
    Homework Helper

    Sometimes symmetry does not help at all. For example, the matrix
    $$A = \begin{pmatrix} 1 & 2 & 3 \\ 2 & 4 & 5 \\ 3 & 5 & 6 \end{pmatrix}$$
    has eigenvalues that are pretty horrible expressions involving cube roots and arctangents of things involving square roots, and the like.
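
    Still, the sum of the squared eigenvalues of this matrix stays simple, which a quick numpy check confirms (a sketch):

    import numpy as np

    A = np.array([[1.0, 2.0, 3.0],
                  [2.0, 4.0, 5.0],
                  [3.0, 5.0, 6.0]])

    print(np.linalg.eigvalsh(A))       # messy irrational eigenvalues
    print(np.trace(A @ A))             # yet tr(A^2) is exactly 129.0
    print(np.sum(A**2))                # = sum of squared entries = 129.0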
     