
Butterfly effect from .01 degree temperature uncertainty

  1. Jul 21, 2009 #1
    I've always been curious about the boundary between the quantum and classical regimes, and I've often wondered whether the weather is not just a chaotic system but also has a degree of true randomness, that is, whether it is significantly influenced by quantum events. So I tried to calculate the approximate magnitude of quantum effects in the atmosphere.

    Earth's atmosphere contains about 5*10^15 tons of gas. 3/4 of the mass is concentrated within 11 km of Earth's surface, and the atmosphere gets thinner as one travels further from the surface. If Earth's radius is 6389.1 km, then the volume containing 3/4 of the atmosphere's total mass is 5.63*10^18 m^3. If the atmosphere is about 80% N2 and 20% O2, then the average mass of a gas molecule is 4.78*10^-26 kg. Dividing the total mass by the molecular mass, there are 7.84*10^43 molecules in 3/4 of the atmosphere, and the average spacing between them is 4.16*10^-9 m in each dimension. So the uncertainty in momentum in each dimension is 1.27*10^-26 kg*m/s (and I believe the total uncertainty per particle is sqrt{(Δpx)^2 + (Δpy)^2 + (Δpz)^2} = 2.20*10^-26 kg*m/s). If the average temperature of the atmosphere is 15 degrees C, this corresponds to an uncertainty in the temperature of each particle on the order of .01 K. Summed over all the particles, this nets an uncertainty in energy on the order of 10^19 Joules, or several gigatons of TNT.
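    In case anyone wants to check my arithmetic, here is a quick Python sketch of the estimate above (all the inputs are my own assumed figures from this post, not authoritative atmospheric data):

```python
import math

hbar = 1.0545718e-34     # J*s
m_air = 4.78e-26         # kg, assumed average molecular mass (80% N2, 20% O2)
M_total = 0.75 * 5e18    # kg, 3/4 of ~5*10^15 tons of atmosphere
R_earth = 6389.1e3       # m, the Earth radius used above
shell = 11e3             # m, height of the layer holding 3/4 of the mass

# Volume of the 11 km shell, then the molecule count and mean spacing
volume = 4/3 * math.pi * ((R_earth + shell)**3 - R_earth**3)
n_molecules = M_total / m_air
spacing = (volume / n_molecules) ** (1/3)

# Heisenberg bound per dimension, Δx·Δp >= ħ/2, combined over three axes
dp_1d = hbar / (2 * spacing)
dp_total = math.sqrt(3) * dp_1d

print(f"{n_molecules:.2e} molecules, spacing {spacing:.2e} m")
print(f"Δp per axis {dp_1d:.2e} kg m/s, total {dp_total:.2e} kg m/s")
```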

    I'm not quite sure how to interpret this. On the one hand, I'm positive that detonating a gigaton explosive would have an effect on the weather. However, what little I know of thermo and QM suggests to me that this energy uncertainty would be infinitesimally and evenly spread out across the particles, and that entropy would prevent a cataclysmic event from happening randomly and all at once.

    So would a .01 degree difference in temperature (on any distance scale) have a noticeable effect on the weather (and on what time scale)? I've heard of the butterfly effect, but I'd like to know if the clouds I'm looking at right now would be a different shape if the quantum jostling of their constituents had played out differently.
  3. Jul 21, 2009 #2
    I'm afraid you can't apply Heisenberg's uncertainty principle like this; the uncertainties don't add that way. The reason macroscopic objects (even parcels of air) don't behave quantum mechanically is decoherence (wiki it). Decoherence is essentially a name for the fact that as more constituents are added to a system, quantum effects tend to blur out. In fact this happens even for very small systems. There are people like Roger Penrose who hold that emergent chaotic phenomena have their roots in quantum uncertainty (he's a proponent of the brain having quantum character), but from a strictly statistical perspective the error in the initial conditions for something like the weather is more than enough to explain the inability to form a predictive model.
  4. Jul 21, 2009 #3
    Please excuse my persistence, but I re-checked my calculations and I think I was off. Now I'm finding that the uncertainty in each particle's temperature is about .3 K. It seems too high to me, but I've run through the calculation a couple of times now, and I'm pretty certain of it. At 15 degrees C, vrms = sqrt{3*k*291 K / 4.78234*10^-26 kg} = 502.028 m/s, so m*vrms = 2.40088*10^-23 kg m/s. Then (m*vrms +/- 1/2*Δp)^2/(2*m) = 6.03203*10^-21 J and 6.02104*10^-21 J, a difference of 1.09944*10^-23 J. Since E = 5/2*k*T, ΔT = 2/5 * 1.09944*10^-23 / k = .318529 K. Could someone corroborate my result, please?
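    Here's the same calculation in Python, step by step, in case that makes it easier for someone to check (the inputs are just my figures from the earlier post):

```python
import math

k_B = 1.380649e-23   # J/K
m = 4.78234e-26      # kg, average molecular mass from my first post
T = 291.0            # K, the temperature I used above
dp = 2.20e-26        # kg m/s, total momentum uncertainty from my first post

v_rms = math.sqrt(3 * k_B * T / m)
p = m * v_rms

# Kinetic energy evaluated at p +/- dp/2, as in the post
E_hi = (p + dp/2)**2 / (2 * m)
E_lo = (p - dp/2)**2 / (2 * m)
dE = E_hi - E_lo

# Invert E = 5/2*k*T (diatomic gas) to express dE as a temperature spread
dT = (2/5) * dE / k_B
print(f"v_rms = {v_rms:.1f} m/s, ΔE = {dE:.4e} J, ΔT ≈ {dT:.3f} K")
```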

    I don't know how I could construct or mathematically manipulate a wave function for the entire atmosphere, so I admit I'm left guessing. And I understand that, statistically, if I were to actually make the approximately 10^44 individual particle measurements to determine the energy of every particle in the system, the total energy would usually come out close to the expectation value for the system, because each particle has only a small range of possible energy values and the differences would become more likely to average out the more particles I measured. Furthermore, I understand that the differences introduced by the uncertainty principle would never start out greater than about 2/3 of a degree, and (I think) they would never accumulate to greater differences than this because this would require energy transfer from cold particles to hot particles, and reduce entropy.

    So let's say that I'm almost guaranteed to return a result within .000001% of the expectation value for the whole system when I measure every particle. On smaller scales, don't I become more likely to see variations from the average? How small do I have to go before these variations become significant? I'd figure that the movements of individual gas molecules in the air are affected by the uncertainty principle, so how big does the system have to be to blur these effects out?
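    To make my "averaging out" intuition concrete, here's a toy simulation (nothing atmospheric about it, just independent random samples): the relative spread of the mean of N particles shrinks like 1/sqrt(N), which is the sense in which small systems fluctuate more than large ones.

```python
import random

random.seed(0)

def relative_spread(n_particles, trials=200):
    """Standard deviation of the sample mean, relative to the true mean of 1.0."""
    means = []
    for _ in range(trials):
        total = sum(random.gauss(1.0, 0.3) for _ in range(n_particles))
        means.append(total / n_particles)
    mu = sum(means) / trials
    var = sum((x - mu)**2 for x in means) / trials
    return var**0.5 / mu

for n in (1, 100, 10000):
    print(f"N = {n:6d}: relative spread ≈ {relative_spread(n):.4f}")
```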
    Last edited: Jul 21, 2009
  5. Jul 21, 2009 #4
    Unless there's a special context to the system, even half a dozen particles will lose their quantum character (hence the extreme difficulty of making a quantum computer). Ultimately, before you even consider something like quantum perturbations, you have to examine numerical stability. In all practicality, chaotic unpredictability is not so much a result of the experimental accuracy of the initial conditions as of the numerical stability of the projection algorithm, i.e. how these small errors are multiplied and compounded with each successive time step. I think you'd be amazed at how numerically unstable even the simplest differential equation can be when computed using a finite element method, where the error is nothing but machine epsilon.
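    To illustrate the compounding, here's a minimal example using the logistic map (a standard chaotic toy model, not a weather model): two trajectories whose initial conditions differ only at the level of machine epsilon separate to order one within a few dozen steps.

```python
import sys

r = 4.0                                   # fully chaotic regime
x = 0.4
y = 0.4 * (1 + sys.float_info.epsilon)    # perturbation of ~1 part in 10^16

diverged_at = None
for step in range(1, 81):
    x = r * x * (1 - x)
    y = r * y * (1 - y)
    if abs(x - y) > 0.1:
        diverged_at = step
        break

if diverged_at is not None:
    print(f"trajectories diverged by {abs(x - y):.3f} after {diverged_at} steps")
```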
  6. Jul 21, 2009 #5


    As Starstrider says, the butterfly effect arises (in models) because errors compound if there is any error in measurement of initial conditions. Discrepancies can start small, and then grow to "any size".

    There are two ways that measurement error could then arise. If the world is continuous, then you would have to make your discrete measurement of its state at "some point" with infinite accuracy, which seems an impossible task, as we can only approach such accuracy in practice. So the model assumes a discrete state exists, but we can't get at it via measurement.

    The QM story seems to raise problems of a different kind. Now the QM model says reality itself is uncertain at the fundamental level. So the uncertainty is ontic rather than epistemic (disregarding hidden variable type QM stories). Though the wavefunction - the shape of the uncertain region so to speak - could be a precisely measured state.

    Furthermore, the QM model - if we take the decoherence view, which seems most natural - would say that the uncertainty would be dissipated by averaging over scale. So rather than the error growing in power-law fashion, it would shrink.

    Thus for classical modelling (like deterministic chaos) you would face one sort of error (discrete measurements of continuous initial conditions) which can't be smoothed over by averaging, and another kind of error - QM uncertainty - which can be safely averaged to some continuum limit. Though that would then create the unmeasurable continuous realm that creates the main type of error?

    Well anyway, I think the intuition you have is that as you descend in the task of measuring initial conditions with ever greater accuracy, you will already be worried by one type of classical error - the problem of making discrete observations in a continuous world - and even before you get there, another kind of error would loom to overwhelm you, the QM kind.
  7. Jul 21, 2009 #6
    Not to counteract apeiron's point, but predictive chaos models are run on real-world computers, and the error in these models will still be dominated by machine-epsilon round-off long before any more exotic quantum-vs-classical error takes hold. Computers don't store real numbers; they store floats and doubles, 32 bits or 64 bits. That limited precision is more than enough to account for forecast errors, even before any mismatch between the model and reality.
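    For concreteness, here's a small Python snippet that reports the machine epsilons in question; since Python's own floats are doubles, the single-precision value is recovered by round-tripping through a 4-byte float.

```python
import struct
import sys

# 64-bit (double) machine epsilon, straight from the runtime: ~2.2e-16
print(f"double epsilon: {sys.float_info.epsilon:.3e}")

def to_f32(x):
    """Round a Python float to the nearest 32-bit float and back."""
    return struct.unpack("f", struct.pack("f", x))[0]

# Halve until 1 + eps/2 is no longer distinguishable from 1 in 32 bits: ~1.2e-7
eps32 = 1.0
while to_f32(1.0 + eps32 / 2) != 1.0:
    eps32 /= 2
print(f"single epsilon ≈ {eps32:.3e}")
```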
  8. Jul 21, 2009 #7
    Thank you for your replies. I think I understand what you're saying about how inaccuracies in predicting the weather are the result of our inability to measure, and that the inability to measure more precisely is a technical problem, not one of quantum origin. And further, I think I understand apeiron's point: that the fact that the universe has a continuum of states may be of quantum origin and sully all our measurements before we ever make them, even if we had more precise measuring tools.

    I have another question. I haven't studied statistical mechanics yet, but I got to thinking about how a system of many particles, or how many repeated measurements on the same particle, might behave. Let's say the wave function of every particle in a gas is sin(kx), with expectation momentum pex. If we measure one particle, the chances of finding it between x1 and x2 are ∫ N*sin^2(kx) dx from x1 to x2 (N being a normalizing factor). The chances of two particles being measured in that range, or of two successive measurements of the same particle both falling in it, would be ∫ M*sin^4(kx) dx from x1 to x2 (M being another normalizing factor), would they not? So with 10^44 particles, the wave function of the entire system measured at any instant would look something like A*sin^(10^44)(kx), which is practically a delta function centered at pi/2. So does decoherence arise because the wave function becomes something like a delta function--in other words, because the expectation value becomes extremely likely and all other possibilities vanishingly small?
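    Out of curiosity I tried checking this narrowing numerically. A crude Python sketch (simple midpoint integration, treating sin^(2N) as an unnormalized probability density on (0, pi), not a real many-body wavefunction) shows the distribution piling up around pi/2 as N grows:

```python
import math

def prob_near_peak(N, halfwidth=0.1, steps=20000):
    """P(|x - pi/2| < halfwidth) when p(x) is proportional to sin(x)^(2N)."""
    dx = math.pi / steps
    xs = [(i + 0.5) * dx for i in range(steps)]
    weights = [math.sin(x) ** (2 * N) for x in xs]
    near = sum(w for x, w in zip(xs, weights)
               if abs(x - math.pi / 2) < halfwidth)
    return near / sum(weights)

for N in (1, 100, 10000):
    print(f"N = {N:5d}: P(|x - pi/2| < 0.1) = {prob_near_peak(N):.4f}")
```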

    So let's say that in order to have a macroscopically noticeable effect on the weather, 10^40 particles in the atmosphere would have to be measured between x1 and x2 over 1000 interactions, or "measurements"; then the wave function would look like B*sin^2000(kx). If the peak of the function isn't between x1 and x2, then the chance of this happening is extremely small, but isn't it still nonzero? In other words, couldn't quantum fluctuation still possibly have an effect on the weather, even if the chances are one in a googol?
    Last edited: Jul 22, 2009