I've been reading up on the mathematics of quantum theory, and it's all pretty interesting. I have a background in information theory, so when I read about the uncertainty principle I had an idea. Here it is: say the possible observable values for momentum are p_1, p_2, ..., p_n and the possible observable values for position are x_1, x_2, ..., x_n. We can then write down a probability vector for observing, say, a specific position. For example, if the probability that the observed particle lies at x_1 is 0.5 and the probability that it lies at x_2 is 0.5, then our vector would be (0.5, 0.5, 0, 0, ..., 0), with the vector being n-dimensional. Now, due to the uncertainty principle, a zero Shannon entropy H = -Σ_i p_i log p_i for the, say, momentum vector would correspond to a high entropy for the vectors of other observables.
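To make the setup concrete, here is a minimal sketch (in Python, with NumPy) of the Shannon entropy of such a probability vector; the (0.5, 0.5, 0, ..., 0) example and the n = 8 dimension are just the illustration above, not anything special:

```python
import numpy as np

def shannon_entropy(p):
    # Shannon entropy in bits; terms with p_i = 0 contribute nothing
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

n = 8
position_probs = np.zeros(n)
position_probs[0] = 0.5
position_probs[1] = 0.5  # the (0.5, 0.5, 0, ..., 0) vector from the example

print(shannon_entropy(position_probs))       # 1.0 bit
print(shannon_entropy(np.full(n, 1.0 / n)))  # uniform vector: log2(8) = 3.0, the maximum
```

The entropy ranges from 0 (a definite outcome) to log2(n) (a uniform distribution), which is why log(n) is a natural guess for any bound involving these quantities.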

Thus far, everything is obvious. But can we take this further and say that H(observable1) + H(observable2) = k, where k is a constant? If we can, what would determine the constant? I'm thinking log(n), but I could be wrong.
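One way to probe this numerically: in a finite-dimensional model it is standard to take the momentum amplitudes to be the discrete Fourier transform of the position amplitudes. Under that assumption (which is my modeling choice here, not something from the question), the sum of entropies is not a constant, only bounded below:

```python
import numpy as np

def shannon_entropy(p):
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def entropy_sum(psi):
    # position probabilities are |psi_i|^2; momentum probabilities come from
    # the unitary discrete Fourier transform of the amplitudes
    p_x = np.abs(psi) ** 2
    p_p = np.abs(np.fft.fft(psi, norm="ortho")) ** 2
    return shannon_entropy(p_x) + shannon_entropy(p_p)

n = 8

# a position eigenstate: H(x) = 0 and H(p) = log2(n), so the sum is exactly log2(n)
basis_state = np.zeros(n, dtype=complex)
basis_state[0] = 1.0
print(entropy_sum(basis_state))  # ≈ 3.0 (= log2 8)

# a generic random state: the sum comes out strictly greater than log2(n)
rng = np.random.default_rng(0)
psi = rng.normal(size=n) + 1j * rng.normal(size=n)
psi /= np.linalg.norm(psi)
print(entropy_sum(psi))  # > 3.0, so the sum is not a constant
```

This matches the known entropic uncertainty relation (Maassen–Uffink), which for Fourier-conjugate bases gives the inequality H(x) + H(p) ≥ log2(n) rather than an equality, with equality attained by basis states. So log(n) does show up, but as a lower bound on the sum, not as a fixed value of it.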

**Physics Forums | Science Articles, Homework Help, Discussion**


# Entropy and the Uncertainty Principle
