Entropy and the Uncertainty Principle

SUMMARY

This discussion centers on the relationship between entropy and the uncertainty principle in quantum mechanics, specifically examining how Shannon entropy applies to observable values of momentum and position. The participants explore the idea of summing Shannon entropies of different observables and question the validity of using Shannon entropy in quantum systems, referencing the work of Edwin Thompson Jaynes and Ariel Caticha. They also discuss alternative measures of entropy, such as those proposed by Anton Zeilinger, and highlight the ambiguity in choosing an appropriate entropy measure for physical systems. The conversation concludes with a reference to Hirschman's uncertainty principle, which indicates that the sum of entropies is not constant but is always greater than a specific value.

PREREQUISITES
  • Understanding of quantum mechanics principles, particularly the uncertainty principle
  • Familiarity with information theory concepts, especially Shannon entropy
  • Knowledge of mathematical constructs related to probability vectors and entropy measures
  • Awareness of current research in maximum entropy methods in physics
NEXT STEPS
  • Research maximum entropy methods in physics as discussed by Ariel Caticha and Edwin Thompson Jaynes
  • Explore alternative entropy measures proposed by Anton Zeilinger and their implications for quantum systems
  • Investigate the Hirschman uncertainty principle and its relevance to entropy in quantum mechanics
  • Read "Decoding the Universe" by Charles Seife for insights on the intersection of information theory and cosmology
USEFUL FOR

This discussion is beneficial for physicists, mathematicians, and researchers interested in the intersection of information theory and quantum mechanics, particularly those exploring entropy measures and their implications in physical theories.

IttyBittyBit
I've been reading up on the mathematics of quantum theory, and it's all pretty interesting. I have a background in information theory, so when I read about the uncertainty principle I had an idea. Here it goes: let's say that the possible observable values for momentum are p1, p2, ..., pn and the possible observable values for position are x1, x2, ..., xn. Thus we can establish a probability vector for observing, say, a specific position. For example, if the probability that the observed particle lies at x1 is 0.5 and the probability that it lies at x2 is 0.5, then our vector would be (0.5, 0.5, 0, 0, ..., 0), with the vector being n-dimensional. Now, due to the uncertainty principle, a zero Shannon entropy (H = −Σᵢ qᵢ log qᵢ, where the qᵢ are the components of the probability vector) for, say, the momentum vector would correspond to a high entropy for the vectors of the other observables.

Thus far, everything is obvious. But can we take this further and say that H(observable1) + H(observable2) = k, where k is a constant? If we can, what would determine the constant? I'm thinking log(n), but I could be wrong.
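
To make the question concrete, here is a minimal numerical sketch of the discrete case. It is my own toy construction, not something from the thread: I model the "momentum" basis as the discrete Fourier transform of the "position" basis (so n, the state sampling, and the function names are all my choices). For random states the sum H_x + H_p is not constant, but it never drops below log n, which suggests log(n) as a lower bound rather than an equality.

```python
import numpy as np

def shannon_entropy(q):
    """H = -sum q_i log q_i (natural log), ignoring zero entries."""
    q = q[q > 0]
    return -np.sum(q * np.log(q))

n = 16
# Unitary DFT matrix: the "momentum" basis as the Fourier transform of the "position" basis.
F = np.fft.fft(np.eye(n)) / np.sqrt(n)

rng = np.random.default_rng(0)
sums = []
for _ in range(1000):
    # random normalized state vector in the n-dimensional space
    psi = rng.normal(size=n) + 1j * rng.normal(size=n)
    psi /= np.linalg.norm(psi)
    H_x = shannon_entropy(np.abs(psi) ** 2)       # "position" probabilities
    H_p = shannon_entropy(np.abs(F @ psi) ** 2)   # "momentum" probabilities
    sums.append(H_x + H_p)

print("min of H_x + H_p over sampled states:", min(sums))
print("log n =", np.log(n))   # the sum varies from state to state but never drops below log n
```

Running this, the minimum of the sampled sums stays above log(16) ≈ 2.77, in line with the entropic uncertainty bound for Fourier-conjugate bases discussed later in the thread.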
 
The general direction in which you are posing questions is interesting.

While I don't think that particular idea of summing Shannon entropies in the way you suggest is likely to be of much use, there is a lot of related work published on this.

Some of it is still active research, so there aren't always settled questions with set answers.

Since you come from information theory, I think you will like Ariel Caticha and E. T. Jaynes.
- http://en.wikipedia.org/wiki/Edwin_Thompson_Jaynes
- http://www.albany.edu/physics/ariel_caticha.htm

They generally work on maximum entropy methods in physics.

There are also plenty of related interesting papers, for example:
- Quantum models of classical mechanics: maximum entropy packets, http://arxiv.org/abs/0901.0436

I personally think the more serious issue in all these entropy methods is that the choice of entropy measure is really ambiguous. The usual arguments for singling out a particular entropy measure are not beyond question.

If you dig around there are a lot of interesting "information theoretic" angles to physics and QM with various levels of ambition.

/Fredrik
 
Thanks for the info. Yes, there seems to be some disagreement over information measures. I also read up on Anton Zeilinger's work in this area: he claims that Shannon entropy is of no use for quantum systems and proposes another measure based on Σᵢ (qᵢ − 1/n)², if I remember correctly. I'm not convinced though; Shannon entropy is a very strong concept mathematically, and you'd have to make some pretty big arguments against it for it to lose value.
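
For what it's worth, here is a tiny sketch contrasting the two measures on a few probability vectors. The quadratic form is taken exactly as quoted above (Zeilinger's actual normalization may differ); it behaves as a measure of certainty, maximal for a sharp distribution and zero for the uniform one, i.e. the opposite orientation to Shannon entropy.

```python
import numpy as np

def shannon(q):
    """Shannon entropy -sum q_i log q_i, skipping zero entries."""
    q = q[q > 0]
    return -np.sum(q * np.log(q))

def quadratic_measure(q):
    """The sum_i (q_i - 1/n)^2 form quoted above (normalization not guaranteed to match Zeilinger's)."""
    n = len(q)
    return np.sum((q - 1.0 / n) ** 2)

for q in [np.array([1.0, 0.0, 0.0, 0.0]),      # sharp distribution
          np.array([0.5, 0.5, 0.0, 0.0]),      # intermediate
          np.array([0.25, 0.25, 0.25, 0.25])]: # uniform distribution
    print(q, "Shannon:", shannon(q), "quadratic:", quadratic_measure(q))
```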
 
I won't try to motivate it here, but in short, from my point of view there is a connection between the general case of background independence in physics (i.e. "background" referring to any background structure, not just spacetime properties) and the ergodic hypothesis implicit in your choice of information measure.

The objection to Shannon entropy is not mathematical, of course. From a purely mathematical point of view, there is no problem.

As I see it, it's not that there is a better measure than Shannon's; it's rather that the notion of a measure has to be seen in a larger context. What is the physical choice corresponding to the choice of measure?

This is an active research area, I think. The information road to physics is great, but perhaps the framework of standard Shannon entropy is not the right one.

/Fredrik
 
On the issue of "choice of information measure" I disagree with Ariel's and Jaynes's reasoning.

But they are nevertheless interesting and worth reading, in particular if you're not aware of them. Fortunately one doesn't have to agree on every point to appreciate a paper :)

/Fredrik
 
Ittybitty... you might enjoy reading Charles Seife's book: DECODING THE UNIVERSE

"How the new science of information is explaining everything in the cosmos from our brains to black holes." This is a book for the general public and does not have any math but I found his qualitative insights refreshing.
 
Thanks Naty, but I'm interested in the math, not layman science! Not that that kind of stuff is bad; it's just that I feel it's a betrayal of both the reader and the subject matter.
 
As a conclusion to the idea put forward in the first post, here is what I found:

http://en.wikipedia.org/wiki/Hirschman_uncertainty

It's basically what I was talking about, except it uses differential entropy instead of discrete entropy. As it turns out, the sum of the entropies is not a constant; however, it is always guaranteed to be greater than a specific number.
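
Here is a minimal numerical check of that continuous version, assuming a Gaussian wavepacket and units with ħ = 1 (the grid sizes, the σ value, and the FFT conventions are my own choices). If I have the conventions right, the bound is log(eπ) ≈ 2.14 in these units, and a Gaussian should saturate it.

```python
import numpy as np

# Grid for the position-space wavefunction.
N = 4096
L = 80.0
dx = L / N
x = (np.arange(N) - N // 2) * dx

sigma = 1.3                                      # width of the Gaussian packet (arbitrary choice)
psi = np.exp(-x**2 / (4 * sigma**2)).astype(complex)
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)      # normalize so that ∫ |psi|^2 dx = 1

# Momentum-space wavefunction via the discrete Fourier transform (hbar = 1).
p = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(N, d=dx))
dp = p[1] - p[0]
phi = np.fft.fftshift(np.fft.fft(np.fft.ifftshift(psi))) * dx / np.sqrt(2 * np.pi)
phi /= np.sqrt(np.sum(np.abs(phi)**2) * dp)      # guard against discretization error

def diff_entropy(rho, d):
    """Differential entropy -∫ rho log rho, approximated by a Riemann sum on the grid."""
    rho = rho[rho > 1e-300]
    return -np.sum(rho * np.log(rho)) * d

h_x = diff_entropy(np.abs(psi)**2, dx)
h_p = diff_entropy(np.abs(phi)**2, dp)

print("h_x + h_p =", h_x + h_p)
print("bound log(e*pi) =", np.log(np.e * np.pi))
```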
 
