Diagonalization of Gigantic Dense Hermitian Matrices

AI Thread Summary
The discussion centers on the challenges of diagonalizing large dense Hermitian matrices, particularly those sized 25k-by-25k and larger. Standard routines like LAPACK's zheevx() are deemed insufficient due to their slow performance. The RMM-DIIS method, as proposed by Wood and Zunger, is highlighted as a more effective alternative for this task. Participants seek guidance on learning resources and concepts necessary for implementing RMM-DIIS, emphasizing the need for a solid understanding of numerical analysis techniques. The QR algorithm is mentioned as a potential method, with its performance varying based on the specific matrix characteristics. The importance of experimenting with random matrix generation and analyzing computation time versus iteration for medium-sized matrices is noted, suggesting that such data could help predict performance for larger matrices.
Bora
Hi there,

This is a question about numerical analysis, particularly as used in computational condensed matter physics or anywhere else one needs to DIAGONALIZE GIGANTIC DENSE HERMITIAN MATRICES.

In order to diagonalize dense Hermitian matrices of size 25k-by-25k and larger (e.g. 1e6-by-1e6), it is not enough to use canned routines like zheevx() in LAPACK; they are far too slow. One needs state-of-the-art routines like RMM-DIIS (Residual Minimisation - Direct Inversion in the Iterative Subspace) from Wood and Zunger, "A New Method for Diagonalising Large Matrices" [J. Phys. A: Math. Gen. 18, 1343 (1985)]. Since I have never written an eigensolver before, can anyone with experience advise what I need to learn in order to program RMM-DIIS quickly? E.g. a list of methods and concepts, with books/chapters. Thanks.
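For orientation, here is a minimal single-vector sketch of the RMM-DIIS idea in Python/NumPy. It is a simplified reading of the method, not the production algorithm from the Wood-Zunger paper: minimise the residual (H - eps*I)psi over a small history of trial vectors, with a diagonal preconditioner. The function name, the Jacobi-style preconditioner, and the parameter choices are illustrative assumptions.

```python
import numpy as np

def rmm_diis_eigenpair(H, psi0, max_iter=50, history=5, tol=1e-8):
    """Sketch of single-vector RMM-DIIS: converges to the eigenpair
    nearest the initial Rayleigh quotient, so the starting guess matters
    (in practice one works band by band with good initial vectors)."""
    diagH = np.real(np.diag(H))
    psis, resids = [], []
    psi = psi0 / np.linalg.norm(psi0)
    eps = np.real(np.vdot(psi, H @ psi))
    for _ in range(max_iter):
        eps = np.real(np.vdot(psi, H @ psi))   # Rayleigh quotient
        R = H @ psi - eps * psi                # residual (H - eps*I) psi
        if np.linalg.norm(R) < tol:
            break
        psis.append(psi.copy())
        resids.append(R.copy())
        psis, resids = psis[-history:], resids[-history:]
        m = len(resids)
        # DIIS step: coefficients c minimising ||sum_i c_i R_i||
        # subject to sum_i c_i = 1 (Lagrange-multiplier linear system).
        B = np.zeros((m + 1, m + 1), dtype=complex)
        for i in range(m):
            for j in range(m):
                B[i, j] = np.vdot(resids[i], resids[j])
        B[:m, m] = 1.0
        B[m, :m] = 1.0
        rhs = np.zeros(m + 1, dtype=complex)
        rhs[m] = 1.0
        c = np.linalg.solve(B, rhs)[:m]
        psi_bar = sum(ci * p for ci, p in zip(c, psis))
        R_bar = sum(ci * r for ci, r in zip(c, resids))
        # Simple Jacobi-style preconditioner applied to the residual step.
        precond = 1.0 / np.clip(diagH - eps, 1e-2, None)
        psi = psi_bar - precond * R_bar
        psi = psi / np.linalg.norm(psi)
    return eps, psi

# Example usage on a small random Hermitian test matrix:
# rng = np.random.default_rng(0)
# A = rng.standard_normal((200, 200)) + 1j * rng.standard_normal((200, 200))
# H = (A + A.conj().T) / 2
# eps, psi = rmm_diis_eigenpair(H, rng.standard_normal(200).astype(complex))
# print(eps, np.linalg.eigvalsh(H))
```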
 
I'm not sure if you've considered the QR algorithm, but I have used it in the past and it works well. Here is a short video explaining how to write a simple code for it:
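If it helps, here is a minimal sketch of the basic (unshifted) QR iteration in Python/NumPy. It is illustrative only: production eigensolvers first reduce the matrix to tridiagonal form and use shifted, deflated iterations.

```python
import numpy as np

def qr_eigenvalues(H, max_iter=1000, tol=1e-10):
    """Unshifted QR iteration for a Hermitian matrix: repeatedly factor
    A = QR and form RQ (a similarity transform Q^H A Q), which drives A
    toward a diagonal matrix whose entries are the eigenvalues."""
    A = np.array(H, dtype=complex)
    for _ in range(max_iter):
        Q, R = np.linalg.qr(A)
        A = R @ Q
        off_diag = np.linalg.norm(A - np.diag(np.diag(A)))
        if off_diag < tol:
            break
    return np.sort(np.real(np.diag(A)))
```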
 
Thanks a lot. How does the QR algorithm perform with gigantic matrices?
 
It depends on the matrix. Just generate a random matrix (in MATLAB, rand or randn will do; symmetrize it to get a Hermitian test case), write the code, and see. The computation time really depends on the matrix size and how accurate you want the eigenvalues. If you take data for computation time vs. iteration for a few medium-sized matrices, you could probably plot that and extrapolate with a curve fit to predict larger matrix sizes.
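As a rough illustration of this suggestion, here is a small Python/NumPy timing sketch. The test sizes, the use of numpy.linalg.eigh as the solver being timed, and the power-law fit are all assumptions; swap in your own QR or RMM-DIIS code to profile that instead.

```python
import time
import numpy as np

sizes = [500, 1000, 2000, 4000]
times = []
for n in sizes:
    A = np.random.randn(n, n) + 1j * np.random.randn(n, n)
    H = (A + A.conj().T) / 2            # random dense Hermitian matrix
    t0 = time.perf_counter()
    np.linalg.eigh(H)                   # solver under test
    times.append(time.perf_counter() - t0)

# Fit t ~ c * n^p on a log-log scale and extrapolate to larger sizes.
p, logc = np.polyfit(np.log(sizes), np.log(times), 1)
for n_big in (25_000, 1_000_000):
    print(f"n = {n_big}: predicted ~{np.exp(logc) * n_big**p:.0f} s")
```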
 