Does "Entropy" play a role in Quantum Physics?
A short answer for a short question. Another question?
Thank you for your fast and kind reply. Please enlighten me.
When you shine light on a particle behind the two slits in Young's double-slit experiment, it is no longer in a pure state and its entropy grows.
Entropy appears in many places in quantum physics.
Uncertainty relations have entropic versions which say, for example, that the entropy of the position distribution plus the entropy of the momentum distribution must be larger than some constant. A technically simpler version occurs for spin 1/2 states, e.g. the entropy of the Sz basis distribution plus the entropy of the Sx basis distribution is bounded below by 1 bit. See http://arxiv.org/abs/1001.4668 for other examples and some proofs.
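Here is a minimal numerical check of the spin-1/2 case (the state parametrization and function names are my own, not from the paper): for any pure qubit state, the Shannon entropy of the Sz outcomes plus that of the Sx outcomes stays at or above 1 bit, the Maassen–Uffink bound for these two bases.

```python
import numpy as np

def shannon_entropy(probs):
    """Shannon entropy in bits, skipping zero probabilities."""
    p = np.array([q for q in probs if q > 0])
    return float(-np.sum(p * np.log2(p)))

def entropy_sum(theta):
    """H(Sz) + H(Sx) for the state cos(theta)|up> + sin(theta)|down>."""
    psi = np.array([np.cos(theta), np.sin(theta)])
    # Outcome probabilities in the Sz (computational) basis
    p_z = np.abs(psi) ** 2
    # Sx eigenstates are (|up> +/- |down>)/sqrt(2)
    plus = np.array([1.0, 1.0]) / np.sqrt(2)
    minus = np.array([1.0, -1.0]) / np.sqrt(2)
    p_x = [abs(plus @ psi) ** 2, abs(minus @ psi) ** 2]
    return shannon_entropy(p_z) + shannon_entropy(p_x)

# The bound is saturated by basis states (H = 0 + 1) and exceeded in between.
for t in np.linspace(0, np.pi / 2, 7):
    assert entropy_sum(t) >= 1 - 1e-9
```

Note that a definite Sz state (theta = 0) gives zero Sz entropy but a full bit of Sx entropy, which is exactly the trade-off the relation encodes.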
Entropy plays an important role in quantum communication and quantum computation. For example, it is used to determine the amount of resources (Bell pairs) needed to teleport a quantum state from one party to another.
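As a concrete sketch of that resource count (my own example, using the standard Schmidt-decomposition trick via SVD): the one Bell pair consumed per teleported qubit carries exactly one bit of entanglement entropy.

```python
import numpy as np

# Bell pair |Phi+> = (|00> + |11>)/sqrt(2), written as a 2x2 coefficient matrix
# whose rows index the first qubit and columns the second.
bell = np.array([[1.0, 0.0], [0.0, 1.0]]) / np.sqrt(2)

# The entanglement entropy is the Shannon entropy of the squared Schmidt
# coefficients, i.e. the squared singular values of the coefficient matrix.
schmidt = np.linalg.svd(bell, compute_uv=False) ** 2
entropy = -sum(p * np.log2(p) for p in schmidt if p > 0)
print(entropy)  # ≈ 1.0: one Bell pair carries one "ebit" of entanglement
```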
Entropy also plays role in quantum many-body physics where it helps us quantify the amount of entanglement in the quantum state of the system. Different many-body systems can have qualitatively different kinds of entanglement, e.g. as occurs in so-called fractional quantum Hall states.
Entropy is about information. There remain open questions, though: information about what? And what is quantum information?
I think information is proportional to the log of the number of possible states, just like in classical systems. A particle with two possible spin states has 1 bit of information stored in the spin. This is not the same as knowing which spin it has. Knowledge is not the same as information.
I would like to know this too, but as I'm not a physicist, I'd love to have it explained in lay terms. I can guess at the meaning of terms like momentum distribution, spin states, information communication, many-body physics, entanglement, Hall states, a log of states, slit experiments, Young, a classical system, etc., but each guess takes me further from understanding. Is there a way to explain how entropy affects quantum physics, perhaps using analogies?
As an example, a Youtube video I saw explained that the strings in string theory are like tiny strands of energy. (The visual was very helpful.) If that's so, then are those strands of energy part of or subject to entropy? And how would that work?
If you're interested in how entropy in information theory works, these videos are quite good.
In physics, entropy plays a role as a measure of the amount of information (in bits) that you don't know about a system.
In thermodynamics, all you know about a system are its macroscopic properties (total energy, volume, particle number, etc.). The entropy in this case is the remaining amount of information (in bits) about the system (down to the state of the last particle). Boltzmann's constant is included as a conversion factor so that we can keep our old units of temperature.
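As a toy illustration (the numbers are my own), a system of N two-state particles about which we know nothing hides N bits of information, and multiplying by k_B ln 2 converts that missing information into thermodynamic units:

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

N = 100           # particles, each with two equally likely states
missing_bits = N  # log2(2**N): one unknown bit per particle

# Thermodynamic entropy S = k_B ln(W) with W = 2**N microstates,
# equivalently k_B * ln(2) * (missing information in bits).
S = k_B * math.log(2) * missing_bits
print(S)  # entropy in J/K
```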
In quantum physics, there are entropic uncertainty relations (as mentioned by Physics Monkey). What these relations tell you is that it is not possible to prepare a particle with a definite position and momentum (that is, so that no information about its position and momentum remains to be known).
What makes entropic uncertainty relations particularly nice is that you can use them to derive other information based limits in quantum measurement.
As an example, information exclusion relations are derived from entropic uncertainty relations.
What these exclusion relations tell us is that the more a measurement tells you about the position of a particle, the less it can also tell you about its momentum. No matter how clever your measurement, if you learn everything about the position of a particle, you learn nothing about its momentum, and vice versa.
We associate one bit with a qubit, and the same with another qubit. Suppose now that we have a state which is not a Fock state but a superposition of a 1-qubit state and a 2-qubit state.
How can we describe the entropy of this state?
This came out recently. http://arxiv.org/abs/1311.0813
John C. Baez and Blake S. Pollard
In statistical mechanics we can recover Boltzmann’s formula by maximizing entropy subject to a constraint on the expected energy. This raises the question: what is the quantum mechanical analogue of entropy? We give a formula for this quantity, which we call ‘quantropy’. We recover Feynman’s formula from assuming that histories have complex amplitudes, that these amplitudes sum to one, and that the amplitudes give a stationary point of quantropy subject to a constraint on the expected action.
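For comparison, the classical result the abstract refers to is the standard Lagrange-multiplier derivation of the Boltzmann distribution: maximize the entropy

\[
S = -\sum_i p_i \ln p_i
\quad\text{subject to}\quad
\sum_i p_i = 1, \qquad \sum_i p_i E_i = \langle E \rangle,
\]

and the stationary point is

\[
p_i = \frac{e^{-\beta E_i}}{Z}, \qquad Z = \sum_i e^{-\beta E_i},
\]

where \(\beta\) is the multiplier for the energy constraint. The paper's "quantropy" plays the analogous role with probabilities replaced by complex amplitudes and energy replaced by action.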
Please check out
Entropy is associated with mixed states, which arise essentially when you don't have full information about the state.
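A quick numerical sketch of that point (my own example): the von Neumann entropy vanishes for a pure state and reaches its maximum of 1 bit for the maximally mixed qubit state.

```python
import numpy as np

def von_neumann_entropy(rho):
    """Von Neumann entropy S(rho) = -Tr(rho log2 rho), in bits."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]  # drop zero eigenvalues (0 log 0 = 0)
    return max(0.0, float(-np.sum(evals * np.log2(evals))))

pure = np.array([[1.0, 0.0], [0.0, 0.0]])       # pure state: full information
mixed = np.array([[0.5, 0.0], [0.0, 0.5]])      # maximally mixed: none
print(von_neumann_entropy(pure))   # 0.0
print(von_neumann_entropy(mixed))  # 1.0
```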