
I'm trying to understand (and implement) the Wang-Landau algorithm for calculating the density of states (DOS). There are zillions of seemingly identical descriptions on the net, which I thought would be straightforward to type into a machine — but when reading the papers and codes I don't get the main point: to me it seems as if the procedure should generate nothing but a flat DOS, independent of the model under consideration.

If I understand correctly, then at level l, after each update attempt and independent of acceptance or rejection, the algorithm increments *both* the histogram h(E) by 1 and the entropy ln(g(E)) by ln(f)_l, at the *identical* E, where E is either the accepted (new) energy or the old energy (after a rejection) ... until h(E) is 'sufficiently' flat. Then one resets h(E) to zero, sets ln(f)_{l+1} = ln(f)_l/2, and starts the preceding all over again ... until ln(f)_l is 'sufficiently' close to zero. (There is a proper way to accept/reject, but I don't see how that is relevant to my problem.)
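For concreteness, here is a minimal sketch of the procedure as I've just described it, on a toy model of my own choosing (n independent coins, with E = number of heads, so the exact DOS g(E) = C(n, E) is binomial, not flat); the function name and all parameter values are only illustrative:

```python
import math
import random

def wang_landau_coins(n=8, n_levels=12, flatness=0.8, seed=1):
    """Wang-Landau sketch on a toy model: n independent coins,
    E = number of heads, so the exact DOS is g(E) = C(n, E)."""
    rng = random.Random(seed)
    n_e = n + 1
    ln_g = [0.0] * n_e                  # running estimate of ln g(E)
    ln_f = 1.0                          # modification factor ln(f)_1
    state = [0] * n                     # start all tails, E = 0
    e = 0
    for _ in range(n_levels):
        hist = [0] * n_e
        steps = 0
        while True:
            i = rng.randrange(n)        # propose flipping one coin
            e_new = e + (1 if state[i] == 0 else -1)
            # accept with probability min(1, g(E_old)/g(E_new))
            if rng.random() < math.exp(min(0.0, ln_g[e] - ln_g[e_new])):
                state[i] ^= 1
                e = e_new
            # increment BOTH h(E) and ln g(E) at the same E,
            # whether the move was accepted or rejected
            hist[e] += 1
            ln_g[e] += ln_f
            steps += 1
            # flatness check: every bin within `flatness` of the mean
            if steps % 1000 == 0 and min(hist) > flatness * sum(hist) / n_e:
                break
        ln_f /= 2.0                     # ln(f)_{l+1} = ln(f)_l / 2
    # normalize so that ln g(0) = 0, since g(0) = 1 exactly
    return [x - ln_g[0] for x in ln_g]
```

The choices of flatness criterion (every bin above 80% of the mean) and initial ln(f)_1 = 1 are just the common conventions I've seen; my question is about the update bookkeeping, not these details.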

So, if h(E)_k is the number of entries in each bin of the histogram at level k, then

ln(g(E))_l = h(E)_1*ln(f)_1 + h(E)_2*ln(f)_1/2 + ... + h(E)_l*ln(f)_1/2^(l-1)

and since the stopping criterion at each level k is that h(E)_k is *flat*, i.e. *independent* of E, ln(g(E))_l must be flat too.

In turn the DOS g(E) will be 'as flat as the histogram'.
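To rule out the bookkeeping step itself as the source of my confusion: since ln g(E) is bumped by ln(f)_k each time h_k(E) is bumped by 1, the final estimate should satisfy ln(g(E))_l = sum_k h(E)_k * ln(f)_k with ln(f)_k = ln(f)_1/2^(k-1). A tiny self-contained check of that identity, replaying a made-up visit sequence (the function name and inputs are invented for illustration):

```python
import math
import random

def accumulate(visits_per_level, n_e, ln_f1=1.0):
    """Replay a sequence of visited energies per level, updating both
    h_k(E) and ln g(E) in lockstep, as in the procedure above."""
    ln_g = [0.0] * n_e
    hists = []
    for k, visits in enumerate(visits_per_level):
        ln_f = ln_f1 / 2 ** k           # ln(f)_k = ln(f)_1 / 2^(k-1)
        h = [0] * n_e
        for e in visits:
            h[e] += 1                   # histogram bump ...
            ln_g[e] += ln_f             # ... and entropy bump, same E
        hists.append(h)
    return ln_g, hists

# arbitrary visit sequences for 3 levels over 5 energy bins
rng = random.Random(0)
seq = [[rng.randrange(5) for _ in range(100)] for _ in range(3)]
ln_g, hists = accumulate(seq, 5)
for e in range(5):
    total = sum(h[e] / 2 ** k for k, h in enumerate(hists))
    assert abs(ln_g[e] - total) < 1e-12  # identity holds exactly
```

So the arithmetic identity holds by construction; my puzzle is why flat per-level histograms do not force the left-hand side to be flat as well.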

What am I getting wrong here?

Thx. for your comments.

wbwb