I've been experimenting with some MD simulations, a field I'm not really familiar with. I put together a LAMMPS script to simulate a Lennard-Jones fluid and compute the radial distribution function g(r). I get the oscillations one would expect, which is good, but what surprises me is that the minimum of g(r) after the first maximum is still above 1 (where g(r) is normalized so that a uniform distribution in space gives 1).

I haven't worked much in this field before, so I'm not sure, but I would have expected the first minimum to dip below 1.

Is this more likely caused by a numerical issue, or by the model parameters I chose?

I'm just looking for general suggestions about what could cause this, so I'm not posting the details of my LAMMPS script or parameters. If people really want to see them, I can.
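One check I've been considering, in case the problem is in my normalization rather than the physics: for an ideal gas (uniformly random particle positions in a periodic box), a correctly normalized g(r) should be flat at 1 for all r. Below is a minimal, LAMMPS-independent sketch of that check in NumPy; the particle count, box size, and bin count are arbitrary choices, not values from my actual simulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ideal-gas sanity check: N uniformly random points in a periodic cubic box.
# A correctly normalized g(r) should hover around 1 at every r.
N = 800           # number of particles (arbitrary)
L = 10.0          # box side length (arbitrary)
pos = rng.uniform(0.0, L, size=(N, 3))

# Minimum-image pair separations
diff = pos[:, None, :] - pos[None, :, :]
diff -= L * np.round(diff / L)
dist = np.sqrt((diff ** 2).sum(axis=-1))
dist = dist[np.triu_indices(N, k=1)]   # unique pairs only

# Histogram into spherical shells, valid out to L/2 for a cubic box
nbins, rmax = 50, L / 2
edges = np.linspace(0.0, rmax, nbins + 1)
counts, _ = np.histogram(dist, bins=edges)

# Normalize by the ideal-gas expectation: each pair lands in a shell
# with probability (shell volume) / (box volume)
shell_vol = 4.0 / 3.0 * np.pi * (edges[1:] ** 3 - edges[:-1] ** 3)
npairs = N * (N - 1) / 2
ideal = npairs * shell_vol / L ** 3
g = counts / ideal

print(g.round(2))   # each value should be close to 1.0
```

If this flat baseline comes out systematically above 1, the shell-volume or density normalization is the culprit; if it sits at 1 and the LJ run still never dips below 1, the issue is more likely the state point (e.g. density/temperature) or insufficient equilibration.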

Thanks!

# Radial distribution function question

**Physics Forums - The Fusion of Science and Community**
