Entropy and the Uncertainty Principle

In summary: based on the mathematics of quantum theory, it is possible to establish a probability vector for observing a specific position. Due to the uncertainty principle, a low entropy for that vector corresponds to a high entropy for the vectors of conjugate observables. Finding the constant associated with this entropy sum is an active research area.
  • #1
IttyBittyBit
I've been reading up on the mathematics of quantum theory, and it's all pretty interesting. I have a background in information theory, so when I read about the uncertainty principle I had an idea. Here goes: let's say that the possible observable values for momentum are p_1, p_2, ..., p_n and the possible observable values for position are x_1, x_2, ..., x_n. Thus we can establish a probability vector for observing, say, a specific position. For example, if the probability that the observed particle lies at x_1 is 0.5 and the probability that it lies at x_2 is 0.5, then our vector would be (0.5, 0.5, 0, 0, ..., 0), with the vector being n-dimensional. Now, due to the uncertainty principle, a zero Shannon entropy ($-\sum_i P_i \log P_i$, where the $P_i$ are the entries of the probability vector) for, say, the momentum vector would correspond to a high entropy for the vectors of other observables.
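For concreteness, here's a minimal sketch of that entropy computation (Python with NumPy; the probability vectors are just the made-up examples above):

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy -sum_i p_i log2 p_i, in bits.
    Zero-probability entries contribute nothing to the sum."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return -np.sum(nz * np.log2(nz)) + 0.0  # +0.0 normalizes -0.0

# The n-dimensional example vector from the post: (0.5, 0.5, 0, ..., 0)
n = 8
p_position = np.zeros(n)
p_position[0] = p_position[1] = 0.5
print(shannon_entropy(p_position))  # 1.0 (one bit of uncertainty)

# A perfectly sharp distribution has zero entropy
p_sharp = np.zeros(n)
p_sharp[0] = 1.0
print(shannon_entropy(p_sharp))  # 0.0
```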

Thus far, everything is obvious. But can we take this further and say that H(observable_1) + H(observable_2) = k, where k is a constant? If so, what would determine the constant? I'm thinking log(n), but I could be wrong.
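Here's a quick numerical probe one could run (a sketch; using the discrete Fourier transform as a stand-in for the position/momentum relation is an assumption on my part):

```python
import numpy as np

def shannon_entropy(p):
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return -np.sum(nz * np.log2(nz)) + 0.0

n = 8
rng = np.random.default_rng(0)

for trial in range(3):
    # Random normalized "position" amplitudes; "momentum" amplitudes
    # come from the unitary discrete Fourier transform.
    psi = rng.normal(size=n) + 1j * rng.normal(size=n)
    psi /= np.linalg.norm(psi)
    phi = np.fft.fft(psi) / np.sqrt(n)  # divide by sqrt(n) -> unitary

    h_sum = shannon_entropy(np.abs(psi) ** 2) + shannon_entropy(np.abs(phi) ** 2)
    print(f"H(x) + H(p) = {h_sum:.3f}   (log2 n = {np.log2(n):.0f})")
```

On runs like this the sum varies from state to state, so it is not a constant k; for a discrete Fourier pair it is, however, known to be bounded below by log2(n) (the Maassen-Uffink entropic uncertainty relation), which fits the log(n) guess as a lower bound rather than an equality.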
 
  • #2
The general direction of your questions is interesting.

While I don't think that particular idea of summing Shannon entropies in the way you suggest will be of much use, there is a lot of related work published on this.

Some of it is still active research, so there aren't always well-posed questions with settled answers.

Coming from information theory, I think you will like Ariel Caticha and ET Jaynes.
- http://en.wikipedia.org/wiki/Edwin_Thompson_Jaynes
- http://www.albany.edu/physics/ariel_caticha.htm

They generally work on maximum entropy methods in physics.

There are also plenty of related interesting papers, for example:
- Quantum models of classical mechanics: maximum entropy packets, http://arxiv.org/abs/0901.0436

I personally think the more serious issue in all these entropy methods is that the choice of entropy measure is really ambiguous. The usual arguments for singling out a particular entropy measure are not beyond question.

If you dig around there are a lot of interesting "information theoretic" angles to physics and QM with various levels of ambition.

/Fredrik
 
  • #3
Thanks for the info. Yes, there seems to be some disagreement over information measures. I also read up on Anton Zeilinger's work in this area; he claims that Shannon entropy is of no use for quantum systems and proposes another measure, based on $\sum_i (p_i - 1/n)^2$ (with $p_i$ the outcome probabilities) if I remember correctly. I'm not convinced, though; Shannon entropy is a very strong concept mathematically, and you'd have to make some pretty big arguments against it for it to lose value.
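For comparison, that measure is easy to compute next to the Shannon entropy (a sketch; the exact normalization of Zeilinger's measure is my assumption here):

```python
import numpy as np

def shannon_entropy(p):
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return -np.sum(nz * np.log2(nz)) + 0.0

def zeilinger_measure(p):
    # sum_i (p_i - 1/n)^2: zero for the uniform distribution,
    # maximal for a sharp one -- the opposite sense of Shannon entropy.
    p = np.asarray(p, dtype=float)
    return np.sum((p - 1.0 / len(p)) ** 2)

for p in ([1, 0, 0, 0], [0.25, 0.25, 0.25, 0.25], [0.5, 0.5, 0, 0]):
    print(shannon_entropy(p), zeilinger_measure(p))
```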
 
  • #4
I won't try to motivate it fully, but in short, from my point of view there is a connection between background independence in the general sense in physics (i.e. "background" referring to any background structure, not just spacetime properties) and the ergodic hypothesis implicit in your choice of information measure.

The objection to Shannon is not mathematical, of course. From a purely mathematical point of view, there is no problem.

As I see it, it's not that there is a better measure than Shannon's; it's rather that the notion of a measure is seen in a larger context. What is the physical choice corresponding to the choice of measure?

This is an active research area, I think. The information road to physics is great, but perhaps the framework of standard Shannon entropy is not the right one.

/Fredrik
 
  • #5
On the issue of the "choice of information measure" I disagree with Ariel's and Jaynes's reasoning.

But they are nevertheless interesting and worth reading, in particular if you're not aware of them. Fortunately one doesn't have to agree on every point to appreciate a paper :)

/Fredrik
 
  • #6
Ittybitty... you might enjoy reading Charles Seife's book Decoding the Universe.

"How the new science of information is explaining everything in the cosmos from our brains to black holes." This is a book for the general public and does not have any math, but I found his qualitative insights refreshing.
 
  • #7
Thanks Naty, but I'm interested in math, not layman science! Not that that kind of stuff is bad; it's just that I feel it's a betrayal of both the reader and the subject matter.
 
  • #8
As a conclusion to the idea put forward in the first post, here is what I found:

http://en.wikipedia.org/wiki/Hirschman_uncertainty

It's basically what I was talking about, except it uses differential entropy instead of discrete entropy. As it turns out, the sum of the entropies is not a constant; however, it is always guaranteed to be greater than a specific lower bound.
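For reference, the continuous (differential-entropy) form of that inequality, as sharpened by Białynicki-Birula and Mycielski (constant quoted from memory, so worth double-checking against the article), reads:

```latex
% Entropic uncertainty relation for position and momentum:
% the differential entropies of |psi(x)|^2 and |phi(p)|^2 satisfy
\[
  h_x + h_p \;\ge\; \ln(e\pi\hbar),
  \qquad
  h_x = -\int |\psi(x)|^2 \ln|\psi(x)|^2 \,\mathrm{d}x,
  \quad
  h_p = -\int |\phi(p)|^2 \ln|\phi(p)|^2 \,\mathrm{d}p.
\]
% Equality holds for Gaussian wave packets, so the sum is bounded
% below but not constant -- matching the conclusion above.
```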
 

1. What is entropy?

Entropy is a scientific concept that refers to the measure of disorder or randomness in a system. It is a fundamental principle in thermodynamics and is used to describe the direction and magnitude of energy flow and transformations in a system.
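In formula form (the Gibbs expression; the Shannon entropy discussed in the thread above is the same sum without Boltzmann's constant):

```latex
% Gibbs entropy of a system whose microstates have probabilities p_i;
% k_B is Boltzmann's constant.
\[
  S = -k_B \sum_i p_i \ln p_i
\]
```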

2. How does entropy relate to the second law of thermodynamics?

The second law of thermodynamics states that the total entropy of a closed system always increases over time. This means that in any energy transfer or transformation, some energy will be lost as heat and increase the overall disorder or randomness in the system.

3. What is the uncertainty principle?

The uncertainty principle, also known as Heisenberg's uncertainty principle, is a fundamental principle in quantum mechanics that states that it is impossible to know the exact position and momentum of a subatomic particle at the same time. This is a consequence of the wave nature of quantum states: the position and momentum distributions are related by a Fourier transform, so sharpening one necessarily broadens the other.
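Quantitatively, the standard-deviation form of the principle reads:

```latex
% Heisenberg uncertainty relation: the standard deviations of
% position and momentum in any state satisfy
\[
  \Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}
\]
```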

4. How do entropy and the uncertainty principle relate to each other?

Entropy and the uncertainty principle are both fundamental concepts in physics that deal with unpredictability and disorder in systems. In thermodynamics, entropy describes the increase in disorder over time, while the uncertainty principle describes the inherent unpredictability of subatomic particles.

5. How are entropy and the uncertainty principle applied in real-world scenarios?

Entropy and the uncertainty principle have various applications in fields such as thermodynamics, quantum mechanics, and information theory. In thermodynamics, entropy is used to analyze and predict the direction and magnitude of energy flow and transformations. In quantum mechanics, the uncertainty principle is used to understand the behavior of subatomic particles. In information theory, both principles are used to analyze and measure the information content and randomness of a system.
