
Fluctuation in terms of Hilbert space formalism

  1. Oct 23, 2009 #1
    This will sound like a very amateur question but please read:
    I have been puzzled for a while about the *precise* mathematical meaning of "quantum fluctuation".
    I know what a classical fluctuation is (as found in classical statistical mechanics). I also know what a superposition is. These seem different. A fluctuation sounds definite, if unknowably microscopic and complex, whereas a superposition is about a more fundamental indefiniteness.

    I am told that quantum fluctuations were magnified (huh?) and frozen into the nonuniformities that eventually gave rise to galaxies etc. But this makes it sound like quantum fluctuations are definite states of a classical field that are just really, really small. But this can't be right. How does one magnify a superposition and get something definite? Where is the wave-function collapse?
    It seems to me that the indefiniteness of a quantum state is to be found in the notion of a superposition rather than in some definite something hiding at a microscopic scale.

    The idea that the micro-world is just a buzzing swarm of fluctuations sounds too classical. What is fluctuating, and in what sense?

    I just can't see what a quantum fluctuation could be in terms of the Hilbert space formalism.

    Put another way, what or who collapsed the wave function that led to the definite inhomogeneities that eventually became the galaxies etc.?
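    To make the question concrete: the only formal candidate I can find for a "quantum fluctuation" in the Hilbert space formalism is the nonzero variance of an observable in a state that is not one of its eigenstates. A minimal numerical sketch of that reading (my own assumption, not taken from any textbook definition), using the harmonic oscillator ground state in a truncated Fock basis with ħ = m = ω = 1:

```python
# Sketch: read "fluctuation" of observable A in state |psi> as its variance,
#   (Delta A)^2 = <psi|A^2|psi> - <psi|A|psi>^2,
# which is nonzero whenever |psi> is not an eigenstate of A.
import numpy as np

N = 20                                    # Fock-basis truncation (assumed size)
a = np.diag(np.sqrt(np.arange(1, N)), 1)  # annihilation operator, a|n> = sqrt(n)|n-1>
x = (a + a.T) / np.sqrt(2)                # position operator x = (a + a†)/√2

psi = np.zeros(N)
psi[0] = 1.0                              # ground state |0>

mean_x = psi @ x @ psi                    # <0|x|0>
var_x = psi @ (x @ x) @ psi - mean_x**2   # <0|x²|0> - <0|x|0>²

print(mean_x, var_x)  # 0.0 0.5
```

    So ⟨x⟩ = 0 while (Δx)² = 1/2: the state is perfectly definite as a vector in Hilbert space, yet the observable has spread. Is *that* spread what cosmologists mean when they say fluctuations were magnified, or is something else intended?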