
Entropy and the Uncertainty Principle

  1. Aug 3, 2009 #1
    I've been reading up on the mathematics of quantum theory, and it's all pretty interesting. I have a background in information theory, so when I read about the uncertainty principle I had an idea. Here it goes: suppose the possible observable values for momentum are p1, p2, ..., pn and the possible observable values for position are x1, x2, ..., xn. We can then set up a probability vector for observing, say, a specific position. For example, if the probability that the observed particle lies at x1 is 0.5 and the probability that it lies at x2 is 0.5, then our vector would be (0.5, 0.5, 0, 0, ..., 0), with the vector being n-dimensional. Now, due to the uncertainty principle, a zero Shannon entropy (H = -Σ p_i log p_i) for, say, the momentum vector would correspond to a high entropy for the vectors of other observables such as position.

    Thus far, everything is obvious. But can we take this further and say that H(observable1) + H(observable2) = k, where k is a constant? If we can, what would determine the constant? I'm thinking log(n), but I could be wrong.
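
    To make the question concrete, here is a minimal numerical sketch (not from the thread; the state vectors and function names are my own) that computes H(position) + H(momentum) for a few n-dimensional states, using the discrete Fourier transform to relate the two bases, and compares the sum to log(n):

```python
# Minimal sketch (illustrative only): does H(position) + H(momentum) equal log(n)?
import numpy as np

def shannon_entropy(p, eps=1e-12):
    """Shannon entropy -sum(p_i log p_i) in nats, ignoring zero entries."""
    p = p[p > eps]
    return -np.sum(p * np.log(p))

def entropy_sum(psi):
    """H(x) + H(p) for a normalized state vector psi."""
    psi = psi / np.linalg.norm(psi)
    prob_x = np.abs(psi) ** 2                             # position probabilities
    prob_p = np.abs(np.fft.fft(psi, norm="ortho")) ** 2   # momentum probabilities
    return shannon_entropy(prob_x) + shannon_entropy(prob_p)

n = 8
# A position eigenstate: H(x) = 0, momentum is uniform, so the sum is exactly log(n).
delta = np.zeros(n)
delta[0] = 1.0
# A generic random state: the sum typically exceeds log(n).
rng = np.random.default_rng(0)
random_state = rng.normal(size=n) + 1j * rng.normal(size=n)

print("log(n)       =", np.log(n))
print("delta state  =", entropy_sum(delta))
print("random state =", entropy_sum(random_state))
```

    For a position eigenstate the sum comes out exactly log(n), but for a generic state it is larger, which already suggests the sum is bounded below rather than constant.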
     
  3. Aug 3, 2009 #2

    Fra

    The general direction in which you are posing questions is interesting.

    While I don't think that particular idea of summing Shannon entropies in the way you suggest will be of much use, there is a lot of related work published on this.

    Some of it is still active research, so there are not always well-posed questions with settled answers.

    Coming from information theory, I think you will like Ariel Caticha and E. T. Jaynes.
    - http://en.wikipedia.org/wiki/Edwin_Thompson_Jaynes
    - http://www.albany.edu/physics/ariel_caticha.htm

    They generally work on maximum entropy methods in physics.

    There are also plenty of related, interesting papers, for example:
    - Quantum models of classical mechanics: maximum entropy packets, http://arxiv.org/abs/0901.0436

    I personally think the more serious issue with all these entropy methods is that the choice of entropy measure is genuinely ambiguous. The usual arguments for singling out a particular entropy measure are not beyond question.

    If you dig around, there are a lot of interesting "information theoretic" angles on physics and QM, with varying levels of ambition.

    /Fredrik
     
  4. Aug 3, 2009 #3
    Thanks for the info. Yes, there seems to be some disagreement over information measures. I also read up on Anton Zeilinger's work in this area; he claims that Shannon entropy is of no use for quantum systems and proposes another measure, based on Σ (p_i - 1/n)^2 if I remember correctly. I'm not convinced though; Shannon entropy is a very strong concept mathematically, and you'd have to make some pretty big arguments against it for it to lose value.
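
    For what it's worth, here is a small sketch (my own, purely for illustration) comparing Shannon entropy with the quadratic measure recalled above, Σ (p_i - 1/n)^2, on a few probability vectors:

```python
# Illustrative comparison of Shannon entropy and the quadratic deviation
# from uniformity, sum_i (p_i - 1/n)^2, on a few distributions.
import numpy as np

def shannon(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def quadratic_deviation(p):
    n = len(p)
    return np.sum((p - 1.0 / n) ** 2)

for label, p in [("uniform", np.ones(4) / 4),
                 ("peaked ", np.array([1.0, 0.0, 0.0, 0.0])),
                 ("mixed  ", np.array([0.5, 0.5, 0.0, 0.0]))]:
    print(label, " Shannon:", round(shannon(p), 3),
          " quadratic:", round(quadratic_deviation(p), 3))
```

    The quadratic measure is zero for the uniform distribution and largest for a fully peaked one, so it runs opposite to Shannon entropy; since Σ (p_i - 1/n)^2 = Σ p_i^2 - 1/n, it is essentially the collision (linear) entropy in disguise.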
     
  5. Aug 3, 2009 #4

    Fra

    I won't try to motivate it here, but in short, from my point of view there is a connection between the general case of background independence in physics (i.e. "background" referring to any background structure, not just spacetime properties) and the ergodic hypothesis implicit in your choice of information measure.

    The objection to Shannon is not mathematical, of course. From a purely mathematical point of view, there is no problem.

    As I see it, it's not that there is a better measure than Shannon; it's rather that the notion of a measure is seen in a larger context. What is the physical choice corresponding to the choice of measure?

    This is still an active research area, I think. The information road to physics is great, but perhaps the framework of standard Shannon entropy is not the right one.

    /Fredrik
     
  6. Aug 3, 2009 #5

    Fra

    On the issue of "choice of information measure" I disagree with Ariel's and Jaynes's reasoning.

    But they are nevertheless interesting and worth reading, in particular if you're not aware of them. Fortunately one doesn't have to agree on every point to appreciate a paper :)

    /Fredrik
     
  7. Aug 4, 2009 #6
    Ittybitty....You might enjoy reading Charles Seife's book : DECODING THE UNIVERSE

    "How the new science of information is explaining everything in the cosmos from our brains to black holes." This is a book for the general public and does not have any math but I found his qualitative insights refreshing.
     
  8. Aug 6, 2009 #7
    Thanks Naty, but I'm interested in math, not layman science! Not that that kind of stuff is bad, it's just that I feel as if it's a betrayal to both the reader and the subject matter.
     
  9. Aug 19, 2009 #8
    As a conclusion to the idea put forward in the first post, here is what I found:

    http://en.wikipedia.org/wiki/Hirschman_uncertainty

    It's basically what I was talking about, except it uses differential entropy instead of discrete entropy. As it turns out, the sum of the entropies is not a constant; however, it is always guaranteed to be greater than a specific lower bound.
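
    As a concrete check (my own sketch, in the convention ħ = 1), a minimum-uncertainty Gaussian wave packet saturates that lower bound: its position and momentum differential entropies always sum to log(eπ), independent of the packet's width:

```python
# Sketch (hbar = 1): for a minimum-uncertainty Gaussian wave packet,
# the differential entropies of the position and momentum densities
# sum to log(e*pi) regardless of the packet width.
import numpy as np

def gaussian_differential_entropy(variance):
    """Differential entropy (in nats) of a Gaussian density with the given variance."""
    return 0.5 * np.log(2 * np.pi * np.e * variance)

for sigma_x in [0.5, 1.0, 3.0]:
    sigma_p = 1.0 / (2.0 * sigma_x)   # minimum-uncertainty packet: sigma_x * sigma_p = 1/2
    h_sum = (gaussian_differential_entropy(sigma_x ** 2)
             + gaussian_differential_entropy(sigma_p ** 2))
    print(f"sigma_x = {sigma_x}: H_x + H_p = {h_sum:.4f}, "
          f"bound log(e*pi) = {np.log(np.e * np.pi):.4f}")
```

    Widening the packet in position lowers H_p by exactly the amount H_x rises, so the sum stays at the bound; non-Gaussian states give a strictly larger sum, which is why the relation is an inequality rather than a constant.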
     