Optimizing Eigenvalues of Matrices: A Creative Approach

elphin

Homework Statement

(the matrix is given in the attached 2a.jpg)

The Attempt at a Solution


the usual method i.e. det(A - bI) = 0

i get the equation finally as b^3 - 75b^2 + 1850b - 15576 = 0

from this i get b1^2 + b2^2 + b3^2 = 1925 < 1949

is there an easier/more creative method??

forgot to add tags .. don't delete..
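The brute-force det(A - bI) route can be sketched numerically. The matrix below is made up (the actual problem matrix is in the attached 2a.jpg, which I don't have); it only illustrates the method:

```python
import numpy as np

# Hypothetical 3x3 matrix -- the real one is in the attachment (2a.jpg);
# this one is invented purely for illustration.
A = np.array([[21.0, 1.0, 2.0],
              [1.0, 26.0, 3.0],
              [2.0, 3.0, 28.0]])

# np.poly(A) returns the coefficients of det(bI - A), highest power first:
# b^3 - trace(A)*b^2 + ... - det(A).
coeffs = np.poly(A)

# The roots of the characteristic polynomial are the eigenvalues of A.
eigvals = np.roots(coeffs)
print(coeffs)
print(np.sort(eigvals.real))
```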
 

Attachments

  • 2a.jpg (28.7 KB)
Another way to look at this: Are you familiar with the fact that every complex square matrix is unitarily equivalent to an upper triangular matrix and the properties that are invariant under such an equivalence?
 
stringy said:
Another way to look at this: Are you familiar with the fact that every complex square matrix is unitarily equivalent to an upper triangular matrix and the properties that are invariant under such an equivalence?

no idea... never heard of it before at all! :smile:
 
Hmmm, perhaps HallsofIvy has something else in mind that is simpler than my plan of attack. I'm not sure how to solve this problem by just looking at the spectral radius though, so I can't help with that.

I'll give you a quick rundown on what I was getting at, and I'll provide a few references...

Are you familiar with the notion of "similar matrices?" If not, the general idea is that two square matrices A,B are similar when there exists an invertible P such that

A = P^{-1}BP.

It can be shown that similar matrices share a lot of properties and we say that these properties are invariant under similarity. This definition may look strange, but if you study the matrix representations of abstract linear transformations with respect to different bases, it turns out that those matrices will be related via this "similarity" equation A=P^{-1}BP. The matrix P is called the "change-of-basis" matrix.
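A quick numerical check of that invariance, with a random B and a random (generically invertible) P -- the data here is hypothetical, just to illustrate the definition:

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((3, 3))
P = rng.standard_normal((3, 3))   # random, so invertible with probability 1

# A = P^{-1} B P makes A and B similar by definition.
A = np.linalg.inv(P) @ B @ P

# Properties invariant under similarity: trace, determinant, eigenvalues.
print(np.trace(A), np.trace(B))
print(np.linalg.det(A), np.linalg.det(B))
print(np.sort_complex(np.linalg.eigvals(A)))
print(np.sort_complex(np.linalg.eigvals(B)))
```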

Now, two matrices A,B are unitarily equivalent when there exists a unitary matrix U such that

A = U^* BU.

A unitary matrix U is a matrix whose conjugate transpose U^* is equal to its inverse. So, then, by definition unitarily equivalent matrices ARE similar, but the converse is in general false. So since unitary similarity is a "stronger" condition on matrices in a certain sense, we expect that there will be additional properties that are invariant under unitary similarity in addition to the ones that are invariant under "standard" similarity.

Here's an example. The trace of a matrix is invariant under similarity. But under unitary similarity, not only is the trace invariant: (1) the sum of the squares of the absolute values of ALL the matrix entries is also invariant (in other words, the Frobenius norm, http://en.wikipedia.org/wiki/Frobenius_norm, is invariant). So we use the fact that ANY matrix is unitarily equivalent to an upper triangular matrix (and where are the eigenvalues on an upper triangular matrix? :smile:) and property (1) to find the bound \sqrt{1949}.

If you want to learn why every matrix is unitarily equivalent to an upper triangular matrix, you can learn it from where I learned it, http://books.google.com/books?id=Pl...CDYQ6AEwBA#v=onepage&q=theorem schur&f=false, on page 79. A proof of property (1) is back on page 73. :biggrin:
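That triangularization (the Schur decomposition) and the two invariants can be checked numerically. A sketch using scipy.linalg.schur on a random matrix (hypothetical data, not the thread's matrix):

```python
import numpy as np
from scipy.linalg import schur

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))

# Schur: A = Z T Z^*, with Z unitary and T upper triangular
# (output='complex' guarantees a truly triangular T, not just block form).
T, Z = schur(A, output='complex')

# Unitary equivalence preserves the Frobenius norm ...
fro_A = np.linalg.norm(A, 'fro')
fro_T = np.linalg.norm(T, 'fro')

# ... and the eigenvalues of A sit on the diagonal of T, which gives
# Schur's inequality: sum |lambda_i|^2 <= ||A||_F^2.
eig_sq_sum = np.sum(np.abs(np.diag(T)) ** 2)
print(fro_A, fro_T, eig_sq_sum)
```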
 
Nice problem! :smile:

I've been puzzling on my own what you can say about this matrix.

Note that the values on the main diagonal are large, while the others are small.
This means the eigenvalues will be close to the values on the main diagonal.

Furthermore, the matrix is *almost* symmetric.
A symmetric matrix is unitarily equivalent to a diagonal matrix that has exactly the eigenvalues on its main diagonal and zeroes elsewhere (whereas an upper triangular matrix only has its eigenvalues on the main diagonal).

If the matrix were symmetric, its Frobenius norm would be exactly the square root of the sum of the squared eigenvalues.
As it is, the Frobenius norm is just a little higher.

The determinant of the matrix is 15000, which is also the product of the eigenvalues.
Any eigenvalue that is a rational number has to be a whole number that divides 15000 (by the rational root theorem, since the characteristic polynomial is monic with integer coefficients).
Since we already know the eigenvalues have to be around 21, 26, and 28, the closest divisors to check are 20, 25, and 30.

Furthermore, the sum of the eigenvalues has to be the trace of the matrix, which is 75.
Lo and behold: 20 + 25 + 30 = 75. We have a match! :cool:

So we can quickly find the eigenvalues and calculate the exact value for the root requested.
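The divisor-plus-trace check fits in a few lines of arithmetic, using only det = 15000 and trace = 75 from the thread:

```python
import math

# Candidate eigenvalues: divisors of det(A) = 15000 that lie near the
# diagonal entries 21, 26, 28.
candidates = (20, 25, 30)

product = math.prod(candidates)
total = sum(candidates)
print(product, total)   # should match det(A) = 15000 and trace(A) = 75

# The quantity actually asked for: the root of the sum of squared eigenvalues.
print(math.sqrt(sum(c * c for c in candidates)))
```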
 
You have the equation for the eigenvalues: b^3 - 75b^2 + 1850b - 15576 = 0. If there are three real roots λ1, λ2, λ3, the equation can be written in the form (b - λ1)(b - λ2)(b - λ3) = 0. Do the multiplications and compare the coefficients of each power of b with the ones in the original equation.
You get λ1^2 + λ2^2 + λ3^2 without calculating the λ-s.

ehild
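ehild's hint amounts to Vieta's formulas: for b^3 - 75b^2 + 1850b - c = 0 the sum of the roots is 75 and the sum of pairwise products is 1850, so the sum of squares follows without ever finding the roots:

```python
# Vieta for b^3 - 75 b^2 + 1850 b - c = 0:
#   l1 + l2 + l3          = 75    (minus the b^2 coefficient)
#   l1*l2 + l1*l3 + l2*l3 = 1850  (the b coefficient)
# Since (l1 + l2 + l3)^2 = (sum of squares) + 2 * (pairwise sum):
sum_of_squares = 75 ** 2 - 2 * 1850

# Note the constant term c never enters, which is why the wrong
# constant (-15576 instead of -15000) doesn't affect this answer.
print(sum_of_squares)  # 1925
```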
 
One small correction, the equation should be: b^3 - 75b^2 + 1850b - 15000 = 0. :wink:
 
ehild said:
You get λ1^2 + λ2^2 + λ3^2 without calculating the λ-s.

ehild

Nice! :smile:
I just got what you were hinting at.
It doesn't even matter that the last coefficient is wrong!
 
I like Serena said:
Nice! :smile:
I just got what you were hinting at.
It doesn't even matter that the last coefficient is wrong!

You were very fast!
I did not check the equation :shy: Luckily, the last term does not matter.

ehild
 