Eigenvalues/eigenvectors using householder and QR

In summary, the Householder reduction produces a matrix B that is similar to the original symmetric matrix A, so A and B share the same eigenvalues. The eigenvectors, however, are generally different: an eigenvector ##v## of B corresponds to the eigenvector ##U^{-1}v## of A, where ##U## is the accumulated orthogonal Householder transformation.
  • #1
mrcaze
Dear Friends,

I need to determine eigenvalues/eigenvectors using Householder and QR. I did the following steps:
1. Transform the matrix A to a diagonal matrix using Householder. I read that the matrices are similar, aren't they?
2. Find eigenvalues/eigenvectors using QR factorization;
3. Adjust the found values using the QR algorithm.

However, the eigenvalues/eigenvectors I found satisfy the ##Av=\lambda v## equation only for the diagonal matrix, not for the original A. Is this correct? Why? Can I say that the eigenvalues/eigenvectors I found are the eigenvalues/eigenvectors of A?

thanks
 
  • #2
Hello mrcaze. Welcome to physicsforums!

This is a little outside my expertise, but no one else is answering so I thought I would get things going.

First, could you explain just a little more about what you are doing? For example, is A real and symmetric?
mrcaze said:
Dear Friends,
I need to determine eigenvalues/eigenvectors using Householder and QR. I did the following steps:
1. Transform the matrix A to a diagonal matrix using Householder. I read that the matrices are similar, aren't they?
2. Find eigenvalues/eigenvectors using QR factorization;
3. Adjust the found values using the QR algorithm.

If A is either real and symmetric or complex and Hermitian, then I'm guessing by "transform to diagonal" you mean what many people call tri-diagonal. Also, any similarity transformation (https://en.wikipedia.org/wiki/Matrix_similarity) preserves the eigenvalues and eigenvectors. If you are not familiar with similar matrices, then you probably need to learn more linear algebra before you embark on implementing the QR algorithm. There are a couple of good, free books:
http://joshua.smcvt.edu/linearalgebra/#current_version
http://www.math.brown.edu/~treil/papers/LADW/LADW.html

mrcaze said:
However, the eigenvalues/eigenvectors I found satisfy the ##Av=\lambda v## equation only for the diagonal matrix, not for the original A. Is this correct? Why? Can I say that the eigenvalues/eigenvectors I found are the eigenvalues/eigenvectors of A?

thanks
I'm not sure what you are asking. You can find the eigenvalues and eigenvectors of any square matrix. Once you compute the eigenvalues/eigenvectors you can check your answer. For example, if you think ##v## is an eigenvector with eigenvalue ##\lambda##, you can compute ##A v## and see how close it is to ##\lambda v##.
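Something like this minimal C++ sketch does that check (the 2×2 matrix, eigenvalue and eigenvector here are just made-up illustration values, not anything from your book):

```cpp
#include <cmath>
#include <cstddef>
#include <cstdio>
#include <vector>

using Matrix = std::vector<std::vector<double>>;
using Vector = std::vector<double>;

// Return the Euclidean norm of A*v - lambda*v, i.e. how far (lambda, v)
// is from being an exact eigenpair of A.
double eigen_residual(const Matrix& A, const Vector& v, double lambda) {
    double sum = 0.0;
    for (std::size_t i = 0; i < A.size(); ++i) {
        double Av_i = 0.0;
        for (std::size_t j = 0; j < v.size(); ++j)
            Av_i += A[i][j] * v[j];
        double r = Av_i - lambda * v[i];
        sum += r * r;
    }
    return std::sqrt(sum);
}

int main() {
    // For this symmetric matrix, v = (1, 1)/sqrt(2) is an eigenvector
    // with eigenvalue 3, so the residual should be ~0.
    Matrix A = {{2.0, 1.0}, {1.0, 2.0}};
    Vector v = {1.0 / std::sqrt(2.0), 1.0 / std::sqrt(2.0)};
    std::printf("residual = %g\n", eigen_residual(A, v, 3.0));
    return 0;
}
```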

jason
 
  • #3
Hello Jason!
First of all, thanks for answering me. Let me explain my question better. I have a symmetric matrix (n×n) and I want to compute eigenvalues/eigenvectors using numerical methods, because I need to implement it in C++. I read about this issue and I tried to use a Householder transformation followed by QR factorization/algorithm. So I have this original matrix A, which becomes a tri-diagonal matrix B after the Householder transformation. After that I used QR over B to finally compute eigenvalues/eigenvectors. I implemented the example on pages 295-302 of the book Computing for Numerical Methods Using Visual C++ (by Salleh et al.); these pages are available to read on Google Books. My computed values are the same as the book's, but when I checked them in the ##Av=\lambda v## formula, they worked only for the B matrix, not for A. Why?

thanks again

marcio
 
  • #4
Your matrices ##A## and ##B## are similar, and while similar matrices have the same eigenvalues, the eigenvectors are usually different.
In your case ##B=U A U^{-1}##, where ##U=Q_{n-1} Q_{n-2}\ldots Q_1## is the product of elementary Householder matrices. Note that ##U## is an orthogonal matrix, so ##U^{-1}=U^T = Q_1 Q_2 \ldots Q_{n-1}##.

Using the identity ##B=U A U^{-1}## you can rewrite ##B v=\lambda v## as $$UAU^{-1} v = \lambda v,$$ or equivalently (left multiplying both sides by ##U^{-1}##) $$AU^{-1} v =\lambda U^{-1} v . $$
Thus, to get eigenvectors of ##A## you just need to multiply the eigenvectors of ##B## by the matrix ##U^{-1}##.
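In code that last step is a single matrix-vector product. A minimal C++ sketch (this assumes you accumulated ##U## explicitly as a dense array while applying the Householder reflections):

```cpp
#include <cstddef>
#include <vector>

using Matrix = std::vector<std::vector<double>>;
using Vector = std::vector<double>;

// v is an eigenvector of B = U * A * U^{-1}, where U is orthogonal
// (the accumulated product of the Householder matrices). The matching
// eigenvector of A is U^{-1} v = U^T v, computed here directly.
Vector eigenvector_of_A(const Matrix& U, const Vector& v) {
    const std::size_t n = v.size();
    Vector w(n, 0.0);
    for (std::size_t i = 0; i < n; ++i)
        for (std::size_t j = 0; j < n; ++j)
            w[i] += U[j][i] * v[j];   // (U^T v)_i = sum_j U_{ji} v_j
    return w;
}
```

Because ##U## is orthogonal, no matrix inversion is needed; the transpose does the job.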
 
  • #5
Hawkeye18,

Thanks for adding to this and pointing out my incorrect statement about preserving eigenvectors.

mrcaze,

are you at least getting the correct eigenvalues?

jason
 
  • #6
Jason and Hawkeye,

Yes! The eigenvalues are correct! I thought the eigenvectors should be the same too. I will try to do what Hawkeye said.

Thanks a lot
 

What are eigenvalues and eigenvectors?

Eigenvalues and eigenvectors are concepts in linear algebra that describe the behavior of a linear transformation or linear operator. An eigenvector of a matrix ##A## is a nonzero vector ##v## whose direction is unchanged by the transformation, and the corresponding eigenvalue ##\lambda## is the scalar factor by which that eigenvector is scaled, so that ##Av=\lambda v##.
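For a concrete example, $$A=\begin{pmatrix}2 & 0\\ 0 & 3\end{pmatrix},\qquad v=\begin{pmatrix}1\\0\end{pmatrix},\qquad Av=\begin{pmatrix}2\\0\end{pmatrix}=2v,$$ so ##v## is an eigenvector of ##A## with eigenvalue ##\lambda=2##.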

What is the Householder transformation?

The Householder transformation is a reflection about a hyperplane. In numerical linear algebra it is used to map a given vector onto a multiple of a coordinate direction, zeroing out all of its remaining components. It is a standard building block for tasks such as QR factorization, solving linear systems of equations, and computing eigenvalues and eigenvectors.
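As a rough illustration (a generic sketch, not code from any particular book), here is a single Householder reflection in C++ that zeroes every entry of a vector below the first:

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

using Vector = std::vector<double>;

// Apply to x the Householder reflection H = I - 2*w*w^T (with ||w|| = 1)
// chosen so that H*x has zeros in every component after the first.
Vector householder_reflect(const Vector& x) {
    const std::size_t n = x.size();
    double norm_x = 0.0;
    for (double xi : x) norm_x += xi * xi;
    norm_x = std::sqrt(norm_x);

    // w is proportional to x - alpha*e1, where alpha = -sign(x[0])*||x||;
    // the sign choice avoids cancellation.
    double alpha = (x[0] >= 0.0 ? -norm_x : norm_x);
    Vector w = x;
    w[0] -= alpha;

    double norm_w = 0.0;
    for (double wi : w) norm_w += wi * wi;
    norm_w = std::sqrt(norm_w);
    if (norm_w == 0.0) return x;              // x is already a multiple of e1
    for (double& wi : w) wi /= norm_w;

    // H*x = x - 2*(w . x)*w
    double dot = 0.0;
    for (std::size_t i = 0; i < n; ++i) dot += w[i] * x[i];
    Vector y = x;
    for (std::size_t i = 0; i < n; ++i) y[i] -= 2.0 * dot * w[i];
    return y;                                 // y = (alpha, 0, ..., 0)
}
```

Applying such reflections from both sides of a symmetric matrix, column by column, is what produces the tridiagonal matrix B discussed above.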

How is the Householder transformation used to compute eigenvalues?

The Householder transformation can be used in conjunction with the QR algorithm to compute the eigenvalues and eigenvectors of a matrix. A sequence of Householder similarity transformations first reduces a symmetric matrix to tridiagonal form (or a general matrix to Hessenberg form), which is much cheaper to work with, and the QR algorithm is then applied to this reduced matrix to obtain the eigenvalues and eigenvectors.
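In practice a library routine performs both steps. As one illustration (this assumes the Eigen C++ library is available), Eigen's SelfAdjointEigenSolver follows essentially this scheme, a Householder reduction to tridiagonal form followed by an iterative QR-style eigensolve:

```cpp
#include <iostream>
#include <Eigen/Dense>

int main() {
    // A small symmetric test matrix (values chosen arbitrarily).
    Eigen::Matrix3d A;
    A << 4, 1, 0,
         1, 3, 1,
         0, 1, 2;

    // Internally: Householder reduction to tridiagonal form, then an
    // implicitly shifted QR-style iteration on the tridiagonal matrix.
    Eigen::SelfAdjointEigenSolver<Eigen::Matrix3d> solver(A);
    if (solver.info() != Eigen::Success) return 1;

    std::cout << "eigenvalues:\n"  << solver.eigenvalues()  << "\n";
    std::cout << "eigenvectors (as columns):\n" << solver.eigenvectors() << "\n";
    return 0;
}
```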

What is the QR algorithm?

The QR algorithm is a numerical method for computing the eigenvalues and eigenvectors of a matrix. At each step the current matrix ##A_k## is factored as ##A_k = Q_k R_k##, with ##Q_k## orthogonal and ##R_k## upper triangular, and the next iterate is formed as ##A_{k+1} = R_k Q_k = Q_k^T A_k Q_k##. Each iterate is therefore similar to the original matrix, so the eigenvalues are preserved, and (with shifts and deflation in practical implementations) the iterates converge toward a triangular or diagonal matrix whose diagonal entries are the eigenvalues.
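A bare-bones sketch of the unshifted iteration in C++ (using Eigen's HouseholderQR for the factorization; the 3×3 test matrix is just an example):

```cpp
#include <iostream>
#include <Eigen/Dense>

// Unshifted QR iteration: A_{k+1} = R_k * Q_k where A_k = Q_k * R_k.
// Each step is a similarity transform, so eigenvalues are preserved; for a
// symmetric matrix with distinct eigenvalues the iterates approach a diagonal
// matrix whose diagonal entries are the eigenvalues. (Real codes add shifts
// and deflation for speed; this only shows the basic idea.)
Eigen::MatrixXd qr_iterate(Eigen::MatrixXd A, int steps) {
    for (int k = 0; k < steps; ++k) {
        Eigen::HouseholderQR<Eigen::MatrixXd> qr(A);   // A = Q * R
        Eigen::MatrixXd Q = qr.householderQ();
        Eigen::MatrixXd R = qr.matrixQR().triangularView<Eigen::Upper>();
        A = R * Q;                                     // similar to old A
    }
    return A;
}

int main() {
    Eigen::MatrixXd A(3, 3);
    A << 4, 1, 0,
         1, 3, 1,
         0, 1, 2;
    std::cout << qr_iterate(A, 100) << "\n";   // diagonal ~ eigenvalues
    return 0;
}
```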

What are the applications of eigenvalues and eigenvectors?

Eigenvalues and eigenvectors have many applications in mathematics, physics, and engineering. They are used in solving systems of differential equations, analyzing the stability of dynamical systems, and in various areas of data analysis such as principal component analysis and image compression.
