NeoDevin said: Does anyone here know of any fast algorithms to diagonalize large, symmetric matrices that are mostly zeros? (By large I mean 300x300 up to several million by several million.)
robphy said: For more general matrix types, you might look at LAPACK/BLAS or the Intel MKL.
NeoDevin said: I'm looking at them now. I'm trying to get them to work with C++, but not having any luck. I'm using Dev-C++ on Windows XP. If you have any advice on how to get them working, that would be very helpful.
NeoDevin said: I'm trying to find the lowest eigenvalue (for now; eventually I may need other eigenvalues). I figured that diagonalizing was the easiest way to go.
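For the specific problem of the lowest eigenvalue of a large sparse symmetric matrix, an iterative Lanczos-type solver is the usual tool. As a sketch (in Python rather than the C++ discussed above, and using SciPy's ARPACK wrapper `eigsh`, which is not mentioned in the thread; the tridiagonal test matrix is made up for illustration):

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

# A sparse symmetric test matrix: the 1-D discrete Laplacian
# (tridiagonal, so almost entirely zeros, like the matrices in question).
n = 1000
main = 2.0 * np.ones(n)
off = -1.0 * np.ones(n - 1)
A = sp.diags([off, main, off], offsets=[-1, 0, 1], format="csr")

# Smallest eigenvalue and its eigenvector via shift-invert Lanczos:
# sigma=0 targets the eigenvalues closest to zero.
vals, vecs = eigsh(A, k=1, sigma=0)
lowest = vals[0]
```

Shift-invert (`sigma=0`) factorizes the matrix once and then converges quickly to the eigenvalues nearest the shift, which is usually much faster than asking ARPACK for the smallest eigenvalue directly.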
NeoDevin said: I'm working further now, and it seems that I will be needing the eigenvectors corresponding to the lowest eigenvalues. Do these algorithms return those as well, or is there another algorithm to get them?
I'm going to look at the algorithms suggested now.
Hurkyl said: If they don't, finding an associated eigenvector is an easy problem: it's just a null vector of (A - vI).
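Hurkyl's point can be sketched numerically with inverse iteration: given a known (or approximate) eigenvalue v, repeatedly solving (A - vI)y = x drives x toward the eigenvector whose eigenvalue is closest to v. A minimal NumPy illustration (the helper name `inverse_iteration` and the 2x2 test matrix are made up for this example):

```python
import numpy as np

def inverse_iteration(A, v, iters=20, eps=1e-8):
    """Find an eigenvector of A for a known eigenvalue v.

    Solving (A - v*I) y = x amplifies the component of x along the
    eigenvector whose eigenvalue is nearest v.  A tiny shift eps keeps
    the system from being exactly singular when v is exact.
    """
    n = A.shape[0]
    M = A - (v + eps) * np.eye(n)
    x = np.random.default_rng(0).standard_normal(n)
    for _ in range(iters):
        x = np.linalg.solve(M, x)
        x /= np.linalg.norm(x)   # renormalize to avoid overflow
    return x

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
v = 1.0                      # lowest eigenvalue of this A
x = inverse_iteration(A, v)  # x should satisfy A @ x ≈ v * x
```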
Diagonalization is a mathematical procedure used to find the eigenvalues and eigenvectors of a square matrix. It transforms the matrix into a simpler (diagonal) form, making the eigenvalues and eigenvectors easy to read off.
The procedure finds the eigenvectors of the matrix and uses them to build a diagonal matrix with the eigenvalues along the main diagonal: if P has the eigenvectors as its columns, then D = P⁻¹AP is diagonal, and its diagonal entries are the eigenvalues of the original matrix A.
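The relation above can be checked numerically. For a symmetric matrix the eigenvector matrix P is orthogonal, so P⁻¹ = Pᵀ and no explicit inverse is needed. A small NumPy sketch (the 3x3 test matrix is chosen arbitrarily):

```python
import numpy as np

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# eigh is for symmetric matrices: eigenvalues come back in ascending
# order, and the eigenvectors (columns of P) are orthonormal.
w, P = np.linalg.eigh(A)
D = np.diag(w)

# Reconstruct A = P D P^{-1}; since P is orthogonal, P^{-1} = P.T.
reconstructed = P @ D @ P.T
```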
The algorithm has various applications in fields such as physics, engineering, and computer science. It is commonly used in quantum mechanics to find the energy levels of a system, in signal processing to compress data, and in machine learning for data dimensionality reduction.
Note that not all matrices can be diagonalized. The matrix must meet certain criteria, such as being square and having a full set of linearly independent eigenvectors. (Real symmetric matrices, as in this thread, always qualify.) Matrices that do not meet these criteria require different methods for finding eigenvalues and eigenvectors.
One limitation of the algorithm is that it can only be applied to square matrices. Additionally, it may not always be computationally efficient for large matrices, as it involves calculating the inverse of the matrix of eigenvectors. Other methods, such as the power method, may be more suitable for finding eigenvalues and eigenvectors in these cases.
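As a rough sketch of the power method mentioned above (note it finds the largest-magnitude eigenvalue rather than the smallest; the 2x2 test matrix is made up for illustration):

```python
import numpy as np

def power_method(A, iters=500):
    """Estimate the dominant (largest-magnitude) eigenvalue of A.

    Repeated multiplication by A amplifies the component of x along the
    dominant eigenvector; normalizing each step keeps x bounded.
    """
    x = np.random.default_rng(1).standard_normal(A.shape[0])
    for _ in range(iters):
        x = A @ x
        x /= np.linalg.norm(x)
    # Rayleigh quotient of the converged vector gives the eigenvalue.
    return x @ A @ x

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
est = power_method(A)   # dominant eigenvalue of this A is 3
```

Only a matrix-vector product is needed per step, which is why methods of this family (and their Lanczos refinements) suit large sparse matrices.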