Entropy and the Uncertainty Principle

Discussion Overview

The discussion revolves around the relationship between entropy and the uncertainty principle in quantum theory, exploring mathematical frameworks and information measures relevant to these concepts. Participants examine the implications of Shannon entropy and alternative measures in the context of quantum systems.

Discussion Character

  • Exploratory
  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • One participant proposes a relationship between the Shannon entropies of different observables and questions whether H(observable1) + H(observable2) could equal a constant, suggesting it might be log(n).
  • Another participant expresses skepticism about the utility of summing Shannon entropies in the proposed manner but acknowledges the relevance of related research in the field.
  • A participant references Anton Zeilinger's work, which challenges the applicability of Shannon entropy for quantum systems and suggests an alternative measure based on variance.
  • One participant argues that the objection to Shannon entropy is not mathematical but rather concerns the physical context of measure choice in relation to background independence in physics.
  • Another participant expresses disagreement with the reasoning of Ariel Caticha and Jaynes regarding information measures but still finds their work interesting.
  • A suggestion is made to read a book that discusses the role of information in understanding the universe, although one participant clarifies their preference for mathematical content over popular science.
  • A later post introduces the Hirschman uncertainty principle, noting that while the sum of entropies is not constant, it is always greater than a specific number.

Areas of Agreement / Disagreement

Participants exhibit disagreement over the appropriateness of different information measures, particularly the use of Shannon entropy in quantum contexts. There is no consensus on the best approach or measure to use.

Contextual Notes

Participants highlight the ambiguity in choosing an entropy measure and the dependence of conclusions on the definitions and contexts applied. The discussion remains open-ended with unresolved mathematical implications.

IttyBittyBit
I've been reading up on the mathematics of quantum theory, and it's all pretty interesting. I have a background in information theory, so when I read about the uncertainty principle I had an idea. Here it goes: say the possible observable values for momentum are p1, p2, ..., pn and the possible observable values for position are x1, x2, ..., xn. We can then assign a probability vector to the outcomes of observing, say, a specific position. For example, if the probability that the observed particle lies at x1 is 0.5 and the probability that it lies at x2 is 0.5, then our vector would be (0.5, 0.5, 0, 0, ..., 0), with the vector being n-dimensional. Now, due to the uncertainty principle, a zero Shannon entropy (H = -Σᵢ qᵢ log qᵢ, where the qᵢ are the entries of the probability vector) for, say, the momentum vector would correspond to a high entropy for the vectors of the other observables.

Thus far, everything is obvious. But can we take this further and say that H(observable1) + H(observable2) = k, where k is a constant? If we can, what would determine the constant? I'm thinking log(n), but I could be wrong.
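
To make the question concrete, here is a small numerical sketch. It is my own toy model: a finite n-dimensional system in which the "momentum" basis is taken to be the discrete Fourier transform of the "position" basis, with entropies in nats:

```python
# Toy check: is H(position) + H(momentum) constant over random pure states?
# Assumes an n-dimensional system whose "momentum" basis is the discrete
# Fourier transform of the "position" basis (an illustrative choice).
import numpy as np

def shannon_entropy(probs):
    """Shannon entropy -sum q_i log q_i (in nats), skipping zero entries."""
    q = probs[probs > 0]
    return -np.sum(q * np.log(q))

rng = np.random.default_rng(0)
n = 8
for _ in range(5):
    # Draw a random pure state and normalize it.
    psi = rng.normal(size=n) + 1j * rng.normal(size=n)
    psi /= np.linalg.norm(psi)
    phi = np.fft.fft(psi, norm="ortho")  # amplitudes in the conjugate basis
    h_sum = shannon_entropy(np.abs(psi) ** 2) + shannon_entropy(np.abs(phi) ** 2)
    print(f"H(x) + H(p) = {h_sum:.4f}   (log n = {np.log(n):.4f})")
```

In runs like this the sum varies from state to state rather than sitting at a constant, but for this Fourier-paired choice of bases it never seems to drop below log(n), which suggests that a lower bound rather than an equality is the right statement.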
 
The general direction in which you are posing questions is interesting.

While I don't think summing Shannon entropies in the particular way you suggest will be of much use, there is a lot of related work published on this.

Some of it is still active research, so there aren't always well-posed questions with settled answers.

Coming from information theory, I think you will like Ariel Caticha and ET Jaynes.
- http://en.wikipedia.org/wiki/Edwin_Thompson_Jaynes
- http://www.albany.edu/physics/ariel_caticha.htm

They generally work on maximum entropy methods in physics.

There are also plenty of related, interesting papers, for example:
- Quantum models of classical mechanics: maximum entropy packets, http://arxiv.org/abs/0901.0436

I personally think the more serious issue with all these entropy methods is that the choice of entropy measure is genuinely ambiguous. The usual arguments for singling out a particular entropy measure are not beyond question.

If you dig around there are a lot of interesting "information theoretic" angles to physics and QM with various levels of ambition.

/Fredrik
 
Thanks for the info. Yes, there seems to be some disagreement over information measures. I also read up on Anton Zeilinger's work in this area; he claims that Shannon entropy is of no use for quantum systems, and he proposes another measure based on Σᵢ (qᵢ - 1/n)², if I remember correctly. I'm not convinced, though; Shannon entropy is a very strong concept mathematically, and you'd have to make some pretty strong arguments against it for it to lose its value.
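
For concreteness, here is a quick sketch comparing the two quantities on a few example distributions. The Σᵢ (qᵢ - 1/n)² form is just my paraphrase of Zeilinger's measure, so take the details with a grain of salt:

```python
# Compare Shannon entropy (in nats) with the variance-style quantity
# sum_i (q_i - 1/n)^2 -- my paraphrase of the Zeilinger-type measure above.
import numpy as np

def shannon(q):
    """Shannon entropy -sum q_i log q_i, skipping zero entries."""
    q = np.asarray(q, dtype=float)
    nz = q[q > 0]
    return -np.sum(nz * np.log(nz))

def variance_measure(q):
    """Sum of squared deviations from the uniform distribution."""
    q = np.asarray(q, dtype=float)
    return np.sum((q - 1.0 / len(q)) ** 2)

for q in ([1.0, 0, 0, 0], [0.5, 0.5, 0, 0], [0.25] * 4):
    print(q, f"Shannon = {shannon(q):.3f}",
          f"sum(q - 1/n)^2 = {variance_measure(q):.3f}")
```

Interestingly, the two run in opposite directions: the variance-style quantity is maximal for a sharp distribution and zero for the uniform one, so it behaves like a measure of information rather than of uncertainty.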
 
I won't try to motivate it here, but in short, from my point of view there is a connection between the general case of background independence in physics (i.e. "background" referring to any background structure, not just spacetime properties) and the ergodic hypothesis implicit in your choice of information measure.

The objection to Shannon is not mathematical, of course. From a purely mathematical point of view, there is no problem.

As I see it, it's not that there is a better measure than Shannon; it's rather that the notion of a measure is seen in a larger context. What is the physical choice corresponding to the measure choice?

This is an active research area, I think. The information road to physics is promising, but perhaps the framework of standard Shannon entropy is not the right one.

/Fredrik
 
On the issue of the "choice of information measure" I disagree with Ariel's and Jaynes's reasoning.

But they are nevertheless interesting and worth reading, in particular if you're not aware of them. Fortunately one doesn't have to agree on every point to appreciate a paper :)

/Fredrik
 
Ittybitty... you might enjoy reading Charles Seife's book: Decoding the Universe.

"How the new science of information is explaining everything in the cosmos from our brains to black holes." This is a book for the general public and does not have any math but I found his qualitative insights refreshing.
 
Thanks Naty, but I'm interested in math, not layman science! Not that that kind of stuff is bad; it's just that I feel it's a betrayal of both the reader and the subject matter.
 
As a conclusion to the idea put forward in the first post, here is what I found:

http://en.wikipedia.org/wiki/Hirschman_uncertainty

It's basically what I was talking about, except that it uses differential entropy instead of discrete entropy. As it turns out, the sum of the entropies is not constant; however, it is always guaranteed to be greater than a specific number.
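
For anyone who wants to see the bound in action, here is a small numerical sketch that discretizes Gaussian wave packets and checks the sum of differential entropies against the lower bound log(eπ) (this is the Białynicki-Birula/Mycielski form with ℏ = 1; the grid sizes and FFT conventions are my own illustrative choices):

```python
# Numerical check of the Hirschman-type bound H_x + H_p >= log(e*pi)
# (differential entropies in nats, hbar = 1), for Gaussian wave packets.
import numpy as np

N, L = 2048, 40.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
dx = x[1] - x[0]
dp = 2 * np.pi / (N * dx)  # momentum-grid spacing for the FFT

def diff_entropy(rho, step):
    """Differential entropy -integral rho log rho, via a Riemann sum."""
    nz = rho[rho > 1e-300]
    return -np.sum(nz * np.log(nz)) * step

for sigma in (0.5, 1.0, 2.0):
    psi = np.exp(-x**2 / (4 * sigma**2))          # Gaussian, position std sigma
    psi = psi / np.sqrt(np.sum(psi**2) * dx)      # normalize in position space
    phi = np.fft.fft(psi) * dx / np.sqrt(2 * np.pi)  # continuum FT convention
    total = diff_entropy(psi**2, dx) + diff_entropy(np.abs(phi)**2, dp)
    print(f"sigma = {sigma}: H_x + H_p = {total:.4f}"
          f"   (bound log(e*pi) = {1 + np.log(np.pi):.4f})")
```

Gaussians saturate the bound, so the printed sums should all sit right at log(eπ) ≈ 2.1447; other normalizable states give a strictly larger sum.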
 
