Eigenvector for Large Matrix

In summary: I just need to find a way to approximate the eigenvalues and eigenvectors.
  • #1
senorbum
Hi, this is my first post here, so bare with me.

So I need to compute the eigenvectors of a large matrix, roughly 1000x1000 up to 10000x10000 or so. However, I already have the eigenvalues and the diagonal/superdiagonal form of the matrix. Each eigenvector v satisfies (A-lambda*I)*v = 0, where A is the matrix, lambda is the given eigenvalue, and v is the eigenvector to solve for. However, with such large matrices there would be an absurd number of equations to solve, correct? A) Are there any BLAS algorithms to solve this (seems unlikely)? And B) Would it be easier to ignore the eigenvalues that I can get from a different program and just start at the beginning with something like a Jacobi transformation? I know I can use programs like MATLAB to calculate these eigenvectors, but my goal is to implement a parallel version to reduce the calculation time of the eigenvectors.

If any clarification is needed, please let me know. I will check this thread/forum at least a couple of times during the work day (8am-5pm EST).

Thanks,

Joe
 
  • #2
Matlab does have parallel computing toolbox and I think the "eig" function can be done in a distributed fashion:
http://www.mathworks.com/products/distriben/supported/parallel.html
 
  • #3
csguy said:
Matlab does have parallel computing toolbox and I think the "eig" function can be done in a distributed fashion:
http://www.mathworks.com/products/distriben/supported/parallel.html


I can't use matlab. I'm programming on a GPU.
 
  • #4
Iterative methods like Gauss-Seidel are great if [itex](A-\lambda I)[/itex] is symmetric and positive definite, and possibly under other conditions. On the other hand, for a general matrix it is not guaranteed to converge.
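For reference, here is a minimal NumPy sketch of the Gauss-Seidel iteration being described (illustrative only; a GPU implementation for the OP's problem would look quite different). The small test matrix is my own example, chosen to be symmetric positive definite so the sweep converges:

```python
import numpy as np

def gauss_seidel(A, b, x0=None, tol=1e-10, max_iter=1000):
    """Solve A x = b by Gauss-Seidel sweeps.

    Convergence is guaranteed when A is symmetric positive definite
    (or strictly diagonally dominant); for a general A it may diverge.
    """
    n = len(b)
    x = np.zeros(n) if x0 is None else x0.astype(float).copy()
    for _ in range(max_iter):
        x_old = x.copy()
        for i in range(n):
            # Use the already-updated entries x[:i]; this in-place
            # update is what distinguishes Gauss-Seidel from Jacobi.
            s = A[i, :i] @ x[:i] + A[i, i + 1:] @ x_old[i + 1:]
            x[i] = (b[i] - s) / A[i, i]
        if np.linalg.norm(x - x_old) < tol:
            break
    return x

# Small SPD test system (my own example)
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = gauss_seidel(A, b)
```

Note the inner loop is inherently sequential in i, which is the serial bottleneck discussed below.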
 
  • #5
maze said:
Iterative methods like Gauss-Seidel are great if [itex](A-\lambda I)[/itex] is symmetric and positive definite, and possibly under other conditions. On the other hand, for a general matrix it is not guaranteed to converge.

Do you happen to have any resources (such as a good website) that describe this algorithm? The matrix will be square/symmetric so it should converge fine.

Edit: I found some resources on it, but would still find others helpful if you know of any.
Edit2: This method is probably not going to work for me, as it is very serial. I am in need of something that can at least be somewhat parallel. If I can't find anything like that, I might as well just use a MATLAB implementation. What about a Jacobi implementation? The problem I see with it is that there are numerous matrix calculations.

So, the Jacobi method makes sense to me except for one thing. How do you choose x to start with? I'm probably missing something easy here...
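On the starting-vector question: for the Jacobi iteration, any initial x works whenever the method converges at all (the spectral radius of the iteration matrix must be below 1); x0 = 0 or a random vector are common choices, and the start only affects how many sweeps are needed. A minimal NumPy sketch, with a test system of my own choosing:

```python
import numpy as np

def jacobi(A, b, x0=None, tol=1e-10, max_iter=1000):
    """Solve A x = b with Jacobi iteration."""
    n = len(b)
    x = np.zeros(n) if x0 is None else x0.astype(float).copy()
    D = np.diag(A)              # diagonal part of A
    R = A - np.diagflat(D)      # off-diagonal remainder
    for _ in range(max_iter):
        # Every component is updated from the *old* x, so this update
        # is embarrassingly parallel across components (unlike
        # Gauss-Seidel, which updates in place).
        x_new = (b - R @ x) / D
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Diagonally dominant test system (my own example)
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = jacobi(A, b)
```

The per-component independence of the update is exactly why Jacobi maps to a GPU more naturally than Gauss-Seidel.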
 
  • #6
Here's another guy asking us to "bare" with him! This forum is getting downright kinky!
 
  • #7
HallsofIvy said:
Here's another guy asking us to "bare" with him! This forum is getting downright kinky!

I couldn't remember which spelling to use. That, and it's really hot in an office with 3 computers and 5 monitors running all day every day ;)
 
  • #8
senorbum said:
Edit2: This method is probably not going to work for me, as it is very serial. I am in need of something that can at least be somewhat parallel. If I can't find anything like that, I might as well just use a MATLAB implementation. What about a Jacobi implementation? The problem I see with it is that there are numerous matrix calculations.


Ahh, too bad. I just would like to point out (maybe you already realized this), that it isn't completely serial - although the computation for a single eigenvector is serial, you can compute all the different eigenvectors in parallel.

The more I think about it though, the more I am skeptical if it will work for your matrix. The method would be applied to the matrix [itex]A-\lambda I[/itex] (not A), which is certainly not going to be positive definite. The method could probably be salvaged with some trickery, but I'm not seeing it right now.

The Jacobi method is basically a weaker form of Gauss-Seidel, so I would be surprised if you could get something out of the Jacobi method you couldn't with Gauss-Seidel.
 
  • #9
maze said:
Ahh, too bad. I just would like to point out (maybe you already realized this), that it isn't completely serial - although the computation for a single eigenvector is serial, you can compute all the different eigenvectors in parallel.

The more I think about it though, the more I am skeptical if it will work for your matrix. The method would be applied to the matrix [itex]A-\lambda I[/itex] (not A), which is certainly not going to be positive definite. The method could probably be salvaged with some trickery, but I'm not seeing it right now.

The Jacobi method is basically a weaker form of Gauss-Seidel, so I would be surprised if you could get something out of the Jacobi method you couldn't with Gauss-Seidel.

Yeah, I figured this out as well.

So, the new plan after discussing some things with my adviser is to go a slightly different route, which will hopefully still involve a speedup. Instead of directly speeding up the eigendecomposition, the plan is to speed up the preconditioning enough to impact the problem.
 
  • #10
The area of parallel diagonalization of large matrices is of great interest. To get started, take a look at: http://www.cse.scitech.ac.uk/arc/diags.shtml . If you have additional questions about the software, contact me privately.

Jim Ritchie
 

1. What is an eigenvector for a large matrix?

An eigenvector for a large matrix is a nonzero vector whose direction is unchanged when multiplied by the matrix; it is only scaled, by a factor called the eigenvalue. It is an important concept in linear algebra and is used in various applications such as data analysis and machine learning.

2. How is an eigenvector calculated for a large matrix?

The calculation of eigenvectors for a large matrix involves finding the eigenvalues of the matrix first; these are the scalars λ for which the equation Ax = λx has a nonzero solution x, where A is the matrix and x is the eigenvector. Once the eigenvalues are found, the corresponding eigenvectors can be calculated using various methods such as the power iteration method or the QR algorithm.
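The power iteration mentioned above can be sketched in a few lines of NumPy (the 2x2 test matrix is my own illustrative example): repeatedly multiply a random vector by A and renormalize, and it converges to the eigenvector of the largest-magnitude eigenvalue, provided that eigenvalue is strictly dominant.

```python
import numpy as np

def power_iteration(A, num_iter=500, seed=0):
    """Approximate the dominant eigenpair of A by power iteration."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(A.shape[0])
    v /= np.linalg.norm(v)
    for _ in range(num_iter):
        w = A @ v
        v = w / np.linalg.norm(w)   # renormalize each step
    lam = v @ A @ v                  # Rayleigh quotient estimate
    return lam, v

# Symmetric test matrix with eigenvalues 3 and 1 (my own example)
A = np.array([[2.0, 1.0], [1.0, 2.0]])
lam, v = power_iteration(A)
```

Convergence is geometric with rate |λ2/λ1|, so it is fast when the dominant eigenvalue is well separated.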

3. What is the significance of eigenvectors for large matrices?

Eigenvectors for large matrices are important because they provide a way to simplify complex systems and understand their behavior. They also play a crucial role in finding the principal components of a dataset, which can help reduce its dimensionality and reveal underlying patterns.

4. Can eigenvectors be used to solve systems of linear equations?

Yes, eigenvector computations are closely tied to solving linear systems. For a given eigenvalue λ, the eigenvectors of a matrix A are exactly the nonzero vectors in the null space of A − λI, that is, the nonzero solutions of (A − λI)x = 0. In particular, the eigenvectors for λ = 0 span the null space of A, the set of all solutions to Ax = 0. Therefore, eigenvector problems can be solved with the same machinery used for systems of linear equations.
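As a small illustration of the null-space connection (the matrix and eigenvalue here are my own example): an eigenvector for a known eigenvalue λ can be recovered as a null-space direction of A − λI, here via the SVD.

```python
import numpy as np

# Eigenvectors of A for eigenvalue lam span the null space of A - lam*I.
A = np.array([[2.0, 1.0], [1.0, 2.0]])
lam = 3.0                       # known eigenvalue of this A
M = A - lam * np.eye(2)
# Null-space basis via SVD: the right singular vector paired with the
# (numerically) zero singular value spans the null space of M.
_, s, Vt = np.linalg.svd(M)
v = Vt[-1]                      # direction for the smallest singular value
```

Up to sign, v is the unit eigenvector of A for λ = 3.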

5. Are there any real-world applications of eigenvectors for large matrices?

Yes, there are many real-world applications of eigenvectors for large matrices. Some examples include image and signal processing, network analysis, and recommendation systems. Eigenvectors are also commonly used in physics and engineering to study the behavior of complex systems.
