I haven't been able to figure out how to approach this. It is actually a real-world problem, but I have simplified it a bit to make it easier to explain.

A container filled with helium gas at pressure Pc sits in an environment where the surrounding air is at pressure Pe and temperature Te (let's say 1 bar and 300 K). The container has a small hole through which gas can exit to, or enter from, the environment. A gas cylinder of helium is attached to the container, and a valve can be used to let new helium in. The walls of the container are held at a very low temperature (4 K), so any air that enters will instantly freeze onto them. A controller connected to the valve stabilizes the pressure in the container so that we always have Pc > Pe, i.e. there is a slight over-pressure (say 40 mbar).

Now, the question: how large does the difference Pc - Pe have to be to prevent ANY air from entering the container (which is the goal)?

The "obvious" answer is that no air will enter as long as Pc > Pe: the gas in the container will be leaking out through the hole (and is replenished from the cylinder), but no air will go the "wrong way". However, I wonder if that is really true. My thinking is that the air molecule velocities follow some distribution, so if the pressure difference is too small there might be a chance that a "fast" molecule of air (e.g. nitrogen) could make it in against the outflow. If that happens and it then hits a wall, it will freeze and never leave again.

Hence, I guess the question boils down to what happens at the hole. Does anyone have a (qualitative) answer?
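
For context, here is a back-of-envelope estimate I tried. It is only a sketch: I assume ideal-gas behavior, take the mean thermal speed from the Maxwell-Boltzmann distribution, and treat the hole as a simple Bernoulli orifice (which is itself questionable if the hole is small compared to the mean free path):

```python
import math

# Physical constants (SI units)
k_B = 1.380649e-23        # Boltzmann constant, J/K
N_A = 6.02214076e23       # Avogadro's number, 1/mol
R = k_B * N_A             # gas constant, J/(mol K)

# Conditions from the question
T_e = 300.0               # environment temperature, K
P_e = 1.0e5               # environment pressure, Pa (1 bar)
dP = 40.0e2               # over-pressure Pc - Pe, Pa (40 mbar)

M_N2 = 28.0134e-3         # molar mass of nitrogen, kg/mol
M_He = 4.0026e-3          # molar mass of helium, kg/mol

# Mean thermal speed of a Maxwell-Boltzmann gas: v = sqrt(8 R T / (pi M))
v_N2 = math.sqrt(8 * R * T_e / (math.pi * M_N2))

# Rough bulk outflow speed of helium driven by the over-pressure,
# treating the hole as an ideal orifice (Bernoulli): v = sqrt(2 dP / rho)
rho_He = (P_e + dP) * M_He / (R * T_e)   # ideal-gas density of He in the container
v_out = math.sqrt(2 * dP / rho_He)

print(f"Mean thermal speed of N2 at {T_e:.0f} K: {v_N2:.0f} m/s")  # ~476 m/s
print(f"He outflow speed for dP = {dP/100:.0f} mbar: {v_out:.0f} m/s")  # ~220 m/s
```

If this crude picture is even roughly right, a typical nitrogen molecule moves about twice as fast as the escaping helium stream, which is exactly what makes me doubt the "obvious" answer above.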