I've been playing around with some MD simulations, a field I'm not really familiar with. I put together a script in LAMMPS to simulate a Lennard-Jones fluid and compute the RDF. I get the oscillations one would expect, which is good, but what surprises me is that the minimum of g(r) after the first maximum is still above 1 (where g(r) is normalized so that a uniform distribution in space gives 1). I haven't worked in this field much before, so I'm not sure, but I would have expected that minimum to dip below 1. Any opinions on whether this is more likely caused by some numerical issue or by the model parameters I chose? I'm just looking for general suggestions on what could cause this, so I'm not posting the details of my LAMMPS script or my parameters; if people really want to see them, I can. Thanks!
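For context only, the overall shape of what I'm running is something like the standard textbook setup below. Every number in it (density, temperature, bin count, averaging windows) is a placeholder, not my actual parameters:

```
# Hypothetical minimal LJ fluid + RDF setup (placeholder values)
units           lj
atom_style      atomic
lattice         fcc 0.8442              # reduced density, liquid-like
region          box block 0 10 0 10 0 10
create_box      1 box
create_atoms    1 box
mass            1 1.0

pair_style      lj/cut 2.5
pair_coeff      1 1 1.0 1.0

velocity        all create 1.0 12345
fix             1 all nvt temp 1.0 1.0 0.5

run             5000                    # equilibrate before sampling

# g(r) with 200 bins, time-averaged and written to a file
compute         myRDF all rdf 200
fix             2 all ave/time 100 50 5000 c_myRDF[*] file rdf.dat mode vector
run             5000
```

As far as I understand, compute rdf is already normalized so that an ideal gas gives 1, which is the normalization I described above.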
Since you are looking for general suggestions:

- Have you looked at and compared the RDF at different temperatures, particularly in the gas-like and liquid-like states and the transition between them? (A minimal scripted sweep is sketched below.)
- Why would the minimum be below the value expected for non-interacting particles (assuming that is what you mean by your normalization)? You have an attraction, which should push the value above 1. There is also the competing effect that the nearby particles (whose enhanced probability is reflected in the first maximum) tend to repel further particles, causing an anti-correlation. At this level of naivety, these are two competing effects, both described purely qualitatively. Is there a reason why the second effect should always be stronger than the first?
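Regarding the first point: such a sweep can be scripted directly in the LAMMPS input. A minimal sketch, assuming the box, pair style, and so on are already defined earlier in the script; the IDs, temperature values, and averaging settings here are made up:

```
# Hypothetical sketch: repeat the RDF sampling at several reduced temperatures
variable        T index 0.7 1.0 1.5
label           tloop
velocity        all create ${T} 12345
fix             1 all nvt temp ${T} ${T} 0.5
run             5000                    # re-equilibrate at this temperature
compute         rdfT all rdf 200
fix             2 all ave/time 100 50 5000 c_rdfT[*] file rdf_T${T}.dat mode vector
run             5000
uncompute       rdfT
unfix           2
next            T
jump            SELF tloop
```

Comparing the rdf_T*.dat files should show how the depth of that first minimum evolves between the gas-like and liquid-like regimes.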