What are the easier ways to calculate eigenvalues for a symmetric matrix?

In summary: ##vec\big(\mathbf X\big)^T vec\big(\mathbf X\big) = trace\big(\mathbf X^T \mathbf X\big)## for any real matrix, so the sum of the squared components of a real matrix is always ##trace\big(\mathbf X^T \mathbf X\big)##. For a symmetric matrix ##\mathbf X^T = \mathbf X##, so this equals ##trace\big(\mathbf X^2\big)##, the sum of the squared eigenvalues.
  • #1
Pushoam

Homework Statement



[Attached image: the problem statement — a 3×3 symmetric matrix whose sum of squared eigenvalues is requested, with multiple-choice options.]

Homework Equations

The Attempt at a Solution


I solved it by calculating the eigenvalues from ##|A - \lambda I| = 0##.

This gave me ## \lambda _1 = 6.42, \lambda _2 = 0.387, \lambda_3 = -0.806##.

So, the required answer is 42.02, option (b).

Is this correct?

The matrix is symmetric. Is there any other easier way to find the answer?
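(For readers following along numerically: a minimal NumPy sketch of this computation. The matrix below is a made-up stand-in, since the actual matrix is in the attached image.)

```python
import numpy as np

# Hypothetical stand-in matrix -- the actual matrix is in the attached
# image, so this only illustrates the workflow, not the thread's numbers.
A = np.array([[4.0, 1.0, 2.0],
              [1.0, 1.0, 0.0],
              [2.0, 0.0, 1.0]])

eigenvalues = np.linalg.eigvalsh(A)  # eigenvalue solver for symmetric matrices
print(eigenvalues)                   # the three lambda_i
print(np.sum(eigenvalues**2))        # sum of the squared eigenvalues
```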
 

  • #2
Do you know the relation between the sum of the squares of the eigenvalues and the trace of a matrix?
 
  • #3
LCKurtz said:
Do you know the relation between the sum of the squares of the eigenvalues and the trace of a matrix?
No.
 
  • #4
Pushoam said:
No.
Google is your friend. Try looking up "trace of a matrix".
 
  • #6
Pushoam said:
So, the required answer is 42.02, option (b).

Is this correct?

The matrix is symmetric. Is there any other easier way to find the answer?

The fact that it's symmetric leads to some very nice results. What result do you get if you square each entry in your matrix and then sum them? This is called a squared Frobenius norm (which is one way of generalizing the L2 norm for vectors to matrices).
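A quick numerical illustration of this claim, using a randomly generated symmetric matrix rather than the thread's matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((3, 3))
A = B + B.T                                   # force symmetry

frob_sq = np.sum(A**2)                        # squared Frobenius norm
eig_sq = np.sum(np.linalg.eigvalsh(A)**2)     # sum of squared eigenvalues
print(np.isclose(frob_sq, eig_sq))            # True for symmetric A
```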
 
  • #7
I would like to add to what the previous posters said that you should not round your numbers while doing your computations. Keep them in exact form to the very end and only evaluate numerically at the end if necessary. You should find that the answer is exactly 42, not 42.02.
 
  • #8
Ray Vickson said:
Google is your friend. Try looking up "trace of a matrix".
I googled it.
I got ##tr(A) = \Sigma \lambda_i##, i = 1, 2, 3,
and ##\det(A) = \Pi \lambda_i##, i = 1, 2, 3.
Better to use ##tr(A^2) = \Sigma {\lambda_i}^2##, i = 1, 2, 3.

##Tr(A^n) = Tr(D^n)##, where D is a matrix similar to A.
If a matrix has all distinct eigenvalues, D can be taken to be the diagonal matrix of the ##\lambda_i##.
In that case ##Tr(A^n) = Tr(D^n) = \Sigma {\lambda_i}^n##, i = 1, 2, 3.

But the first two equations will not give the result, and that route is complicated, too.
It will be better to calculate the trace directly, as I did in the OP.
StoneTemplePython said:
The fact that it's symmetric leads to some very nice results. What result do you get if you square each entry in your matrix and then sum them? This is called a squared Frobenius norm (which is one way of generalizing the L2 norm for vectors to matrices).
I was looking for something like this. The sum is 42.

So, is it true for any symmetric matrix?
How to prove it?

## Tr (A^2) = \Sigma_i (A^2)_{ii}
\\ (A^2)_{ii} = \Sigma_j A_{ij} A_{ji}##
For a symmetric matrix, ## A_{ij} = A_{ji}##.
So, ## (A^2)_{ii} = \Sigma_j (A_{ij})^2
\\ Tr (A^2) = \Sigma_i \Sigma_j (A_{ij})^2##
Is this correct?
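(For anyone who wants to check these trace and determinant relations numerically, here is a small sketch with a random symmetric matrix:)

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((3, 3))
A = B + B.T                                         # random symmetric matrix
lam = np.linalg.eigvalsh(A)

print(np.isclose(np.trace(A), lam.sum()))           # tr(A)   = sum of eigenvalues
print(np.isclose(np.linalg.det(A), lam.prod()))     # det(A)  = product of eigenvalues
print(np.isclose(np.trace(A @ A), (lam**2).sum()))  # tr(A^2) = sum of squared eigenvalues
```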
 
  • #9
Pushoam said:
But the first two equations will not give the result, and that route is complicated, too.
It will be better to calculate the trace directly, as I did in the OP.
For ##n = 2## it directly gives you the result. You just need to square the matrix and sum the diagonal of the result, very simple. If you go via the eigenvalues you need to solve for the roots of a third-order polynomial.

Pushoam said:
## Tr (A^2) = \Sigma_i (A^2)_{ii}
\\ (A^2)_{ii} = \Sigma_j A_{ij} A_{ji}##
For a symmetric matrix, ## A_{ij} = A_{ji}##.
So, ## (A^2)_{ii} = \Sigma_j (A_{ij})^2
\\ Tr (A^2) = \Sigma_i \Sigma_j (A_{ij})^2##
Is this correct?

Essentially. However, I think it is easier to go the other way and just see that ##A_{ij}A_{ij} = \mbox{tr}(AA^T)## (summation over repeated indices implied). Since ##A## is symmetric, ##AA^T = A^2##.
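(A one-line numerical check of this identity, with a random symmetric matrix:)

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.standard_normal((4, 4))
A = B + B.T                          # symmetric, so A A^T = A^2

print(np.isclose(np.trace(A @ A),    # tr(A^2)
                 np.sum(A * A)))     # sum over i, j of A_ij^2 = tr(A A^T)
```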
 
  • #10
Pushoam said:
I was looking for something like this. The sum is 42.

So, is it true for any symmetric matrix?
How to prove it?

## Tr (A^2) = \Sigma_i (A^2)_{ii}
\\ (A^2)_{ii} = \Sigma_j A_{ij} A_{ji}##
For a symmetric matrix, ## A_{ij} = A_{ji}##.
So, ## (A^2)_{ii} = \Sigma_j (A_{ij})^2
\\ Tr (A^2) = \Sigma_i \Sigma_j (A_{ij})^2##
Is this correct?

Note: we are dealing in reals for this post. Your approach is close, and maybe even correct, but I find it hard to follow.

My strong preference here is to block your matrix by column vectors.

Suppose you have some matrix ##\mathbf X##, partitioned by columns below

##\mathbf X = \bigg[\begin{array}{c|c|c|c|c}
\mathbf x_1 & \mathbf x_2 &\cdots & \mathbf x_{n-1} & \mathbf x_n\end{array}\bigg]##

to make the link with the traditional L2 norm for vectors, consider the vec operator

##
vec\big(\mathbf X\big) = \begin{bmatrix}
\mathbf x_1 \\
\mathbf x_2\\
\vdots \\
\mathbf x_{n-1}\\
\mathbf x_n
\end{bmatrix}##

which stacks each column of the matrix ##\mathbf X## on top of each other into one big vector. (The vec operator will show up again if and when you start dealing with Kronecker products.)

Our goal is to add up each squared component of ##\mathbf X## into a sum. Do you understand why

##\big \Vert \mathbf X \big \Vert_F^2 = \sum_{j=1}^n\sum_{i=1}^n x_{i,j}^2 = trace\big(\mathbf X^T \mathbf X\big) = vec\big(\mathbf X\big)^Tvec\big(\mathbf X\big)= \big \Vert vec\big(\mathbf X\big) \big \Vert_2^2##

is true for any real matrix?

Now since ##\mathbf X## is symmetric, we have ##\mathbf X^T = \mathbf X## meaning that

##\big \Vert \mathbf X \big \Vert_F^2 = trace\big(\mathbf X^T \mathbf X\big) = trace\big(\mathbf X \mathbf X\big) = trace\big(\mathbf X^2\big)##

Now you just need the fact that others mentioned, i.e. the relation between the trace of a matrix and its eigenvalues (in this case, the trace of the matrix to the second power gives the sum of the eigenvalues to the second power).

Why is this fact true? (Hint: use characteristic polynomial, or if you prefer an easy but less general case: real symmetric matrices are diagonalizable -- do that and apply cyclic property of trace.)

Trace is absurdly useful, so it's worth spending extra time understanding all the related details of this problem.
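(As a numerical companion to the identity chain above; any real matrix will do here, symmetry is not needed:)

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.standard_normal((3, 3))      # any real matrix; symmetry not needed here

vec_X = X.flatten(order="F")         # vec: stack the columns into one vector

frob_sq = np.sum(X**2)               # sum of squared entries
trace_form = np.trace(X.T @ X)       # trace(X^T X)
vec_form = vec_X @ vec_X             # vec(X)^T vec(X)
print(frob_sq, trace_form, vec_form) # all three agree
```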
 
  • #11
Orodruin said:
For ##n = 2## it directly gives you the result. You just need to square the matrix and sum the diagonal of the result, very simple. If you go via the eigenvalues you need to solve for the roots of a third-order polynomial.
Essentially. However, I think it is easier to go the other way and just see that ##A_{ij}A_{ij} = \mbox{tr}(AA^T)## (summation over repeated indices implied). Since ##A## is symmetric, ##AA^T = A^2##.
Then, for an anti-symmetric matrix, ##tr (A^2)= tr (-AA^T) = - A_{ij}A_{ij}## = negative of the sum of the elements of the matrix. Right?
 
  • #12
Pushoam said:
Then, for an anti-symmetric matrix, ##tr (A^2)= tr (-AA^T) = - A_{ij}A_{ij}## = negative of the sum of the elements of the matrix. Right?

Yes.

Note that in general, over real ##n \times n## matrices,

##\big \vert trace\big(\mathbf A \mathbf A \big)\big \vert = \big \vert trace \Big( \big( \mathbf A^T \big)^T \mathbf A\Big)\big\vert \leq trace\big(\mathbf A^T \mathbf A\big) = \big \Vert \mathbf A\big \Vert_F^2 ##

with equality iff ##\mathbf A## is a scalar multiple of ##\mathbf A^T##.

You could prove this with Schur's Inequality. Alternatively (perhaps using the vec operator to help) recognize that the trace gives an inner product. Direct application of Cauchy Schwarz gives you

##\big \vert trace\big(\mathbf B^T \mathbf A \big) \big \vert = \big \vert vec\big( \mathbf B\big)^T vec\big( \mathbf A\big)\big \vert \leq \big \Vert vec\big( \mathbf B\big)\big \Vert_2 \big \Vert vec\big( \mathbf A\big)\big \Vert_2 =\big \Vert \mathbf B \big \Vert_F \big \Vert \mathbf A \big \Vert_F##

with equality iff ##\mathbf B = \gamma \mathbf A##. (Also note trivial case: if one or both matrices is filled entirely with zeros, then there is an equality.)

In your real skew symmetric case, ##\mathbf B = \mathbf A^T## and ##\gamma = -1##. And of course in the real symmetric case ##\gamma = 1##.
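(A quick numerical sanity check of this inequality, with a generic real matrix:)

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((3, 3))          # generic real matrix

lhs = abs(np.trace(A @ A))               # |tr(A A)|
rhs = np.trace(A.T @ A)                  # tr(A^T A) = squared Frobenius norm
print(lhs <= rhs)                        # True; equality iff A is a multiple of A^T
```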
 
  • #13
Pushoam said:
Then, for an anti-symmetric matrix, ##tr (A^2)= tr (-AA^T) = - A_{ij}A_{ij}## = negative of the sum of the elements of the matrix. Right?
I missed writing the squares of the elements.
The corrected one:
Then, for an anti-symmetric matrix, ##tr (A^2)= tr (-AA^T) = - A_{ij}A_{ij}## = negative of the sum of the squares of the elements of the matrix.
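(A short numerical check of the corrected statement, using a randomly generated antisymmetric matrix:)

```python
import numpy as np

rng = np.random.default_rng(5)
B = rng.standard_normal((3, 3))
A = B - B.T                              # antisymmetric: A^T = -A

print(np.isclose(np.trace(A @ A),        # tr(A^2)
                 -np.sum(A**2)))         # minus the sum of the squared entries
```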
 
  • #14
I am not very familiar with some of the algebra mentioned, though I don’t think it is very difficult.

However, it seems possible to solve the problem without knowing all this, though I’m sure it does no harm to know it.

You could write out the eigenvalue equation as a cubic equation. The sum of the roots, ##\sum_i \lambda_i##, is well known from the coefficients, and so is the sum of products of pairs of roots, ##\sum_{i<j} \lambda_i \lambda_j##. From these you can get the sum of squares of the roots: ##\sum_i {\lambda_i}^2 = \big(\sum_i \lambda_i\big)^2 - 2\sum_{i<j} \lambda_i \lambda_j##.

I have not looked into it, but from what was being said about symmetry I suspect it would be easy to solve this cubic.
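(This route can also be checked numerically. The sketch below reads the sums of roots off the characteristic polynomial coefficients, which NumPy's np.poly returns for a matrix argument, and applies the identity above:)

```python
import numpy as np

rng = np.random.default_rng(6)
B = rng.standard_normal((3, 3))
A = B + B.T

# np.poly(A) gives the characteristic polynomial coefficients:
# t^3 + c[1] t^2 + c[2] t + c[3], with c[0] == 1.
c = np.poly(A)
sum_roots = -c[1]                        # sum of the lambda_i
sum_pairs = c[2]                         # sum over i<j of lambda_i * lambda_j
sum_squares = sum_roots**2 - 2 * sum_pairs

print(np.isclose(sum_squares, np.sum(np.linalg.eigvalsh(A)**2)))  # True
```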
 
Last edited:
  • #15
epenguin said:
I am not very familiar with some of the algebra mentioned, though I don’t think it is very difficult.

However, it seems possible to solve the problem without knowing all this, though I’m sure it does no harm to know it.

You could write out the eigenvalue equation as a cubic equation. The sum of the roots, ##\sum_i \lambda_i##, is well known from the coefficients, and so is the sum of products of pairs of roots, ##\sum_{i<j} \lambda_i \lambda_j##. From these you can get the sum of squares of the roots: ##\sum_i {\lambda_i}^2 = \big(\sum_i \lambda_i\big)^2 - 2\sum_{i<j} \lambda_i \lambda_j##.

I have not looked into it, but from what was being said about symmetry I suspect it would be easy to solve this cubic.

Sometimes symmetry does not help at all. For example, the matrix
$$A = \pmatrix{1&2&3\\2&4&5\\3&5&6}$$
has eigenvalues that are pretty horrible expressions involving cube roots and arctangents of things involving square roots, and the like.
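(A quick numerical look at this example; the eigenvalues are unremarkable as numbers, it is their closed forms that are messy:)

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 5.0],
              [3.0, 5.0, 6.0]])

# The numerical eigenvalues are perfectly ordinary; it is their exact
# closed forms that involve the messy cube roots.
print(np.linalg.eigvalsh(A))
```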
 

1. What are eigenvalues and why are they important in calculations?

Eigenvalues are a fundamental concept in linear algebra: they are the scalars ##\lambda## for which a linear transformation ##A## has a nonzero vector ##v## satisfying ##Av = \lambda v##. They are important in calculations because they describe the behavior of a system under the transformation and can be used to solve systems of equations.

2. How do you calculate eigenvalues?

The standard process is to find the roots of the characteristic polynomial of the matrix: subtract ##\lambda## times the identity matrix from the original matrix and take the determinant of the result. The roots of the resulting polynomial in ##\lambda## are the eigenvalues of the original matrix, as the example below illustrates.
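(A minimal numerical version of this procedure in NumPy; the 2×2 matrix is just an illustration:)

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Numerically find the roots of det(A - lambda * I) = 0.
eigenvalues = np.linalg.eigvals(A)
print(eigenvalues)                   # 3 and 1 for this matrix (order may vary)
```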

3. What is the significance of the eigenvectors associated with eigenvalues?

Eigenvectors are the vectors corresponding to the eigenvalues; they mark the directions that the linear transformation merely scales rather than rotates. They are important because they describe the behavior of a system under the transformation and can be used to solve systems of equations. A short check appears below.
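(For instance, a short check that an eigenvector really satisfies ##Av = \lambda v##; the matrix is illustrative only:)

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# For symmetric matrices, eigh returns eigenvalues and orthonormal eigenvectors.
eigenvalues, eigenvectors = np.linalg.eigh(A)
v = eigenvectors[:, 0]                         # eigenvector for eigenvalues[0]
print(np.allclose(A @ v, eigenvalues[0] * v))  # A v = lambda v, so True
```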

4. How are eigenvalues used in real-world applications?

Eigenvalues have a wide range of applications in fields such as engineering, physics, and economics. They are used in the analysis of complex systems, image and signal processing, and optimization problems. They are also used in machine learning algorithms and data analysis techniques.

5. Can a matrix have more than one set of eigenvalues and eigenvectors?

The eigenvalues of a given matrix are uniquely determined (counted with multiplicity), but its eigenvectors are not unique: any nonzero scalar multiple of an eigenvector is again an eigenvector, and when an eigenvalue is repeated, any basis of its eigenspace serves equally well. What does depend on the choice of basis is the matrix representing a linear transformation; a change of basis transforms the eigenvectors accordingly but leaves the eigenvalues unchanged.
