At the moment, I'm working on the equations for the atmospheric pressure/density above a planet using Newtonian gravity instead of the more usually seen constant gravity. The paradox appears to be that, under Newtonian gravity, an ideal gas will apparently always escape the planet's gravity, or so it seems to me (I am a mathematician, not a physicist).

I will now state the problem and the steps and assumptions I have been making.

Firstly, let the mass of the planet be [tex]M[/tex], and let its radius be [tex]R[/tex]. To avoid the mass of the atmosphere affecting the problem, I have assumed the planet has no atmosphere, and that a small finite amount of an ideal gas exists only in a thin vertical tube which has one end on the surface of the planet and which goes straight up towards infinity. For simplicity, the entire system is assumed to have the same temperature.

Let [tex]h[/tex] be height above the surface within the tube. Assuming that the mass of the gas in the thin tube does not affect gravity overly much, the gravity at height [tex]h[/tex] in the tube is simply given by

[tex]g(h)=g_0\left(\frac{R}{R+h}\right)^2[/tex]

where [tex]g_0=GM/R^2[/tex] is the surface gravity of the planet at [tex]h=0[/tex].

Next, the gas in the tube is assumed to have reached hydrostatic equilibrium in which case its pressure [tex]P[/tex] and density [tex] \rho[/tex] obey the hydrostatic equation

[tex]

\frac{dP}{dh}=-\rho(h) g(h)

[/tex]

To relate pressure and density, I have assumed that for an ideal gas at a fixed temperature these are related by the constant [tex]C=\frac{m}{k_B T}[/tex] (where [tex]m[/tex] is the molecular mass), which comes from the ideal gas law, with

[tex]\rho=C P[/tex]

(I suspect this assumption may be the problem in the argument.)
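As a sanity check on the size of this constant, here is a short numerical sketch; the molecular mass, temperature, and surface gravity are illustrative Earth-like values I have chosen, not part of the argument above:

```python
# Illustrative Earth-like values (assumptions, not from the derivation)
k_B = 1.380649e-23               # Boltzmann constant, J/K
m   = 28.97e-3 / 6.02214076e23   # mean molecular mass of air, kg
T   = 288.0                      # temperature, K
g0  = 9.81                       # surface gravity, m/s^2

C = m / (k_B * T)    # the constant relating rho and P
H = 1.0 / (C * g0)   # scale height 1/(C g_0) of the constant-gravity solution

print(C)   # ~1.2e-5 s^2/m^2
print(H)   # ~8.4e3 m, the familiar ~8 km atmospheric scale height
```

The quantity [tex]1/(C g_0)[/tex] is the usual scale height, which gives a feel for the numbers involved later on.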

Then, the density of the gas in the tube obeys the equation

[tex]

\frac{d\rho}{dh}= - C \rho(h) g_0 \left(\frac{R}{R+h}\right)^2

[/tex]

This equation is separable and can be integrated directly, eventually giving

[tex]

\rho(h) = A e^{C g_0 R^2 \frac{1}{R+h}}

[/tex]
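One can verify by direct differentiation that this is indeed the general solution of the separable equation above. A quick symbolic check, using sympy purely as a verification aid:

```python
import sympy as sp

h, R, C, g0, A = sp.symbols('h R C g_0 A', positive=True)

# General solution claimed above
rho = A * sp.exp(C * g0 * R**2 / (R + h))

# Residual of d(rho)/dh = -C * rho * g0 * (R/(R+h))**2
residual = sp.diff(rho, h) + C * rho * g0 * (R / (R + h))**2
print(sp.simplify(residual))  # 0
```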

Assuming that the initial density of the gas at [tex]h=0[/tex] is [tex]\rho_0[/tex], this gives

[tex]

\rho(h) = \rho_0 e^{-C g_0 R} e^{C g_0 R^2 \left(\frac{1}{R+h}\right)}

[/tex]

And with a little algebra this becomes

[tex]

\rho(h) = \rho_0 \exp\left(-C g_0 \left(\frac{h}{1+\frac{h}{R}}\right)\right)

[/tex]

This equation seems to be in some sense correct, as in the limit [tex]h/R \rightarrow 0[/tex] (i.e. for heights small compared with the planetary radius), the usual constant-gravity atmospheric equation [tex]\rho(h)=\rho_0 \exp(-C g_0 h)[/tex] is recovered.
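The agreement at small heights is easy to see numerically; in the sketch below the values of [tex]C g_0[/tex] and [tex]R[/tex] are illustrative Earth-like numbers, not part of the derivation:

```python
import math

Cg0 = 1.2e-4    # illustrative value of C*g_0, 1/m (Earth-like)
R   = 6.371e6   # illustrative planetary radius, m

h = 1.0e3       # 1 km, so h/R is only ~1.6e-4
full = math.exp(-Cg0 * h / (1 + h / R))   # Newtonian-gravity profile
flat = math.exp(-Cg0 * h)                 # constant-gravity profile

print(full, flat)  # the two profiles agree to roughly 4 significant figures
```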

However, the "paradox" is that in the limit as [tex]h \rightarrow \infty[/tex], it can be seen that

[tex]

\lim_{h \rightarrow \infty} \rho(h) = \rho_0 \exp(-C g_0 R)

[/tex]

This is a nonzero constant, implying that the total mass of gas in the tube is infinite. So the only way the equation can properly hold is for [tex]\rho_0=0[/tex]; that is, the gas has spread out and "escaped" out to infinity.
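The approach of the density to its constant floor can be seen numerically. The sketch below uses toy units [tex]R = 1[/tex], [tex]C g_0 = 1[/tex], chosen only so that the limiting value of the exponent, [tex]-C g_0 R[/tex], is not vanishingly small:

```python
import math

Cg0, R = 1.0, 1.0   # toy units, chosen for illustration only
rho0 = 1.0

def rho(h):
    # rho(h) = rho0 * exp(-C g_0 h / (1 + h/R))
    return rho0 * math.exp(-Cg0 * h / (1 + h / R))

floor = rho0 * math.exp(-Cg0 * R)   # limiting density as h -> infinity
for h in (1.0, 10.0, 100.0, 1000.0):
    print(h, rho(h))
# rho(h) approaches the nonzero floor ~0.3679 rather than 0, so the
# mass integral of rho over the infinite tube diverges.
```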

This is the essence of the "paradox" I have encountered. I suspect it is the result of a mistaken physical assumption on my part, but I do not know which one. Attempting to enumerate the (known) assumptions in order, they have been:

1. The entire system has the same temperature.

2. The gas in the tube does not affect gravity in the problem.

3. The gas has reached hydrostatic equilibrium.

4. The (ideal) gas density has a linear relationship with pressure, [tex]\rho=C P[/tex].

Of these assumptions, I would suspect that either 2 or 4 is incorrect or must be modified. I would appreciate any insight that a physicist, cosmologist, or other expert in this field could give on this problem. Thanks in advance for your time.